Creating a Loki Splunk application

One tool that has caught my interest is the Loki APT scanner created by BSK Consulting, a scanner that combines filename, IP address, domain, and hash IOCs with Yara rules, Regin file system checks, process anomaly checks, SWF decompression scans, SAM dump checks, and more to find indicators of compromise on a system. From the Loki GitHub page, Loki currently includes the following IOC checks:

  • Equation Group Malware (Hashes, Yara Rules by Kaspersky and 10 custom rules generated by us)
  • Carbanak APT - Kaspersky Report (Hashes, Filename IOCs - no service detection and Yara rules)
  • Arid Viper APT - Trendmicro (Hashes)
  • Anthem APT Deep Panda Signatures (not officially confirmed) (krebsonsecurity.com - see Blog Post)
  • Regin Malware (GCHQ / NSA / FiveEyes) (incl. Legspin and Hopscotch)
  • Five Eyes QWERTY Malware (Regin Keylogger Module - see: Kaspersky Report)
  • Skeleton Key Malware (other state-sponsored Malware) - Source: Dell SecureWorks Counter Threat Unit(TM)
  • WoolenGoldfish - (SHA1 hashes, Yara rules) Trendmicro Report
  • OpCleaver (Iranian APT campaign) - Source: Cylance
  • More than 180 hack tool Yara rules - Source: APT Scanner THOR
  • More than 600 web shell Yara rules - Source: APT Scanner THOR
  • Numerous suspicious file name regex signatures - Source: APT Scanner THOR
  • Much more ... (cannot update the list as fast as I include new signatures)

The challenge with Loki is that running it across an enterprise and parsing the scan results to find the needle in the haystack can be very laborious. In this post we'll show how to write a Splunk app that automates running Loki, parsing the results, and identifying what is important. For background, Loki is a CLI-based program that can scan a folder, an entire system, and so on for possible indicators of compromise. Its basic usage is:

usage: loki.exe [-h] [-p path] [-s kilobyte] [-l log-file] [-a alert-level]
                [-w warning-level] [-n notice-level] [--printAll]
                [--allreasons] [--noprocscan] [--nofilescan] [--noindicator]
                [--reginfs] [--dontwait] [--intense] [--csv] [--onlyrelevant]
                [--nolog] [--update] [--debug]

Loki - Simple IOC Scanner

optional arguments:
  -h, --help        show this help message and exit
  -p path           Path to scan
  -s kilobyte       Maximum file size to check in KB (default 2048 KB)
  -l log-file       Log file
  -a alert-level    Alert score
  -w warning-level  Warning score
  -n notice-level   Notice score
  --printAll        Print all files that are scanned
  --allreasons      Print all reasons that caused the score
  --noprocscan      Skip the process scan
  --nofilescan      Skip the file scan
  --noindicator     Do not show a progress indicator
  --reginfs         Do check for Regin virtual file system
  --dontwait        Do not wait on exit
  --intense         Intense scan mode (also scan unknown file types and all
                    extensions)
  --csv             Write CSV log format to STDOUT (machine processing)
  --onlyrelevant    Only print warnings or alerts
  --nolog           Don't write a local log file
  --update          Update the signatures from the "signature-base" sub
                    repository
  --debug           Debug output
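
For example, a scan of a specific folder with CSV output routed to a log file (the paths here are illustrative) might look like this:

loki.exe -p C:\Users --intense --csv --noindicator -l C:\logs\loki.log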

Before we begin with the steps to create the Splunk app, download the latest Loki Windows binary from here. For the sake of this blog post we will focus on running Loki on Windows, but the functionality can easily be extended to other operating systems. Once downloaded, run the command "loki.exe --update" to download the latest IOC files into the folder "signature-base", which will be used later.

On a side note, if you would like to further update your IOCs to include AlienVault malicious IPs and domains and MISP IOCs, use the signature update scripts located in "signature-base\threatintel". You will need an API key from AlienVault Open Threat Exchange (OTX) and a MISP API key from a running MISP instance. The AlienVault API key is easy to get; the MISP instance is a little more difficult. In any case, once you have your keys you can write a script that updates either or both services and schedule it as a cron job with the following commands:

# Update AlienVault OTX:
python get-otx-iocs.py -k <API_KEY>

# Update MISP:
python get-misp-iocs.py -k <API_KEY> -u <URL>
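
If you want both updates to run from a single scheduled job, a minimal Python wrapper along these lines should do (the script name updateintel.py and the placeholder keys/URL are illustrative, not part of Loki):

#!/usr/bin/env python
# updateintel.py - hypothetical wrapper that refreshes both IOC feeds.
# Run it from the signature-base\threatintel directory.
import subprocess
import sys

OTX_KEY = "<API_KEY>"   # placeholder - your AlienVault OTX API key
MISP_KEY = "<API_KEY>"  # placeholder - your MISP API key
MISP_URL = "<URL>"      # placeholder - your MISP instance URL

def main():
    # Run both updaters and stop if either fails.
    for cmd in (["python", "get-otx-iocs.py", "-k", OTX_KEY],
                ["python", "get-misp-iocs.py", "-k", MISP_KEY, "-u", MISP_URL]):
        if subprocess.call(cmd) != 0:
            sys.exit("Update failed: %s" % " ".join(cmd))

if __name__ == "__main__":
    main()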

Now that we have an updated Loki executable and signatures we are ready to create the Splunk App directory structure. In your Splunk Deployment Server create the following directories and files:

$SPLUNK_HOME/etc/deployment-apps/Splunk_App_loki
├── bin
|   ├── config
|   |   └── excludes.cfg
|   ├── signature-base\*
|   ├── loki.bat
|   └── loki.exe
├── default
|   ├── app.conf
|   ├── indexes.conf
|   ├── props.conf
|   ├── transforms.conf
|   └── inputs.conf
└── metadata
    └── default.meta

Now that we have our directory structure, there are a few default files that need to be created; we'll run through them quickly:
1) $SPLUNK_HOME\etc\deployment-apps\Splunk_App_loki\bin\signature-base\*

# Copy the folder and its contents created via the command "loki.exe --update" to this location.

2) $SPLUNK_HOME\etc\deployment-apps\Splunk_App_loki\bin\config\excludes.cfg
This is actually a Loki default file, but it should be included nonetheless.

# Excluded directories
#
# Ensure that you have the latest file from the excludes.cfg URL above.
#
# - add directories you want to exclude from the scan
# - double escape back slashes
# - values are case-insensitive
# - remember to use back slashes on Windows and slashes on Linux / Unix / OSX
# - each line contains a regex that matches somewhere in the full path (case insensitive)
# e.g.:
# Regex: \\System32\\
# Matches C:\Windows\System32\cmd.exe
#
# Regex: /var/log/[^/]+\.log
# Matches: /var/log/test.log
# Not Matches: /var/log/test.gz
#

# Useful examples (google "antivirus exclusion recommendations" to find more)
\\Ntfrs\\
\\Ntds\\
\\EDB[^\.]+\.log
Sysvol\\Staging\\Nntfrs_cmp
\\System Volume Information\\DFSR

3) $SPLUNK_HOME\etc\deployment-apps\Splunk_App_loki\default\app.conf
The app.conf file maintains the state of a given app in Splunk Enterprise. It may also be used to customize certain aspects of an app.

## Splunk app configuration file

[install]
is_configured = true
state = enabled

[launcher]
author = epicism
version = 1.0
description = Technology Add-on for the Loki APT Scanner

[ui]
is_visible = false
label = Technology Add-on for Loki APT Scanner

[package]
id = Splunk_App_loki

4) $SPLUNK_HOME\etc\deployment-apps\Splunk_App_loki\metadata\default.meta
The default.meta file contains ownership information, access controls, and export settings for Splunk objects like saved searches, event types, and views. Each app has its own default.meta file.

[]
access = read : [*], write: [ admin ]
export = system

Now that we have the default files out of the way, we can create the Loki-specific configuration files. First is the inputs.conf file, which runs the script that executes the loki.exe binary and reads the Loki scan results.

$SPLUNK_HOME\etc\deployment-apps\Splunk_App_loki\default\inputs.conf
The inputs.conf file contains the settings used to configure data inputs: scripted inputs, file and directory monitoring, and distributed inputs such as forwarders.

# This is where you would place your signature update script if you created it:
# [script://$SPLUNK_HOME\etc\apps\Splunk_App_loki\bin\signature-base\threatintel\updateintel.bat]
# disabled = true
# index = main
# interval = 30 1 * * *
# sourcetype = lokirun

# This entry runs the loki batch script and sends the script output to a null index.
# I could not get loki.exe's output to be ingested by Splunk when running it from this script,
# so I routed loki.exe's output to the $SPLUNK_HOME\...\loki.log in the next stanza.
[script://$SPLUNK_HOME\etc\apps\Splunk_App_loki\bin\loki.bat]
disabled = false
index = main
interval = 0 2 * * *
sourcetype = lokirun
queueSize = 50MB

# The loki.bat batch script saves the loki.exe output to $SPLUNK_HOME\var\log\splunk\loki.log, and this stanza reads it.
[monitor://$SPLUNK_HOME\var\log\splunk\loki.log]
disabled = false
index = loki
sourcetype = loki

$SPLUNK_HOME\etc\deployment-apps\Splunk_App_loki\bin\loki.bat
This script changes its working directory to the location of the script, truncates loki.log to ensure that it doesn't grow endlessly, and runs loki.exe. The relative path "..\..\..\..\var\log\splunk\" places the output log in Splunk's log directory.

cd /d %~dp0
> ..\..\..\..\var\log\splunk\loki.log echo.
start /low /d "%~dp0" loki.exe --reginfs --csv --dontwait --onlyrelevant --noindicator --intense -l ..\..\..\..\var\log\splunk\loki.log

The following configuration files are used by the Splunk search head to parse the Loki log files. The Loki log files are supposed to be in CSV format, but only the first few values actually are, which required some creativity when parsing the events. props.conf parses the properly comma-separated start of each log line, and transforms.conf parses the rest.

$SPLUNK_HOME\etc\deployment-apps\Splunk_App_loki\default\props.conf
A little on props.conf - it is commonly used for:

  • Configuring line breaking for multi-line events;
  • Setting up character set encoding;
  • Allowing processing of binary files;
  • Configuring timestamp recognition;
  • Configuring event segmentation;
  • Overriding automated host and source type matching;
  • Configuring advanced (regex-based) host and source type overrides;
  • Overriding source type matching for data from a particular source;
  • Setting up rule-based source type recognition;
  • Renaming source types;
  • And so on...

Props.conf is an integral part of a Splunk app, and I recommend that you read the props.conf description in the URL above if you're not familiar with it.


# This is for data that we don't want ingested to Splunk
[lokirun]
DATETIME_CONFIG = CURRENT
LINE_BREAKER = ([\r\n]+)
SHOULD_LINEMERGE = false
disabled = 0
TRANSFORMS-null = setnull, setnull2

#This entry parses the loki.exe "CSV" output
[loki]
TIME_PREFIX = ^
TIME_FORMAT = %Y%m%dT%H:%M:%SZ
MAX_TIMESTAMP_LOOKAHEAD = 25
DATETIME_CONFIG = CURRENT
LINE_BREAKER = ([\r\n]+)
SHOULD_LINEMERGE = false
disabled = 0

# Example Log: 20170219T15:46:53Z,WIN-8J1HPPNE2HB,ALERT,FILE: C:\Users\x\Downloads\FlokiBot\64a23908ade4bbf2a7c4aa31be3cff24 SCORE: 100 TYPE: EXE SIZE: 400896 FIRST_BYTES: 4d5a90000300000004000000ffff0000b8000000 / MZ MD5: 64a23908ade4bbf2a7c4aa31be3cff24 SHA1: 2f87c2ce9ae1b741ac5477e9f8b786716b94afc5 SHA256: a4a810eebd2fae1d088ee62af725e39717ead68140c4c5104605465319203d5e CREATED: Tue Feb 07 13:45:11 2017 MODIFIED: Tue Feb 07 07:37:00 2017 ACCESSED: Tue Feb 07 13:45:11 2017REASON_1: Malware Hash TYPE: MD5 HASH: 64a23908ade4bbf2a7c4aa31be3cff24 SUBSCORE: 100 DESC: Flokibot Invades PoS: Trouble in Brazil https://www.arbornetworks.com/blog/asert/flokibot-invades-pos-trouble-brazil/
# EXTRACT-00-HEADER extracts the properly comma-separated values at the start of the log, and the REPORT-00-KEYVALUES entry (defined in transforms.conf) parses the rest of the line.
EXTRACT-00-HEADER = ^(?<DATE>\d+)T(?<TIME>\d+:\d+:\d+)Z,(?<HOSTNAME>[^,]+),(?<SEVERITY>[^,]+),
# REPORT-00-KEYVALUES is responsible for parsing the remaining portion of the Loki event log not parsed by EXTRACT-00-HEADER. transforms.conf is good at parsing repeating values (such as "x=y" * z patterns), which is how Loki outputs its scan results.
REPORT-00-KEYVALUES = trans_keyvalues

$SPLUNK_HOME\etc\deployment-apps\Splunk_App_loki\default\transforms.conf
A little on transforms.conf - it is commonly used for:

  • Configuring regex-based host and source type overrides;
  • Anonymizing certain types of sensitive incoming data, such as credit card or social security numbers;
  • Routing specific events to a particular index, when you have multiple indexes;
  • Creating new index-time field extractions. NOTE: We do not recommend adding to the set of fields that are extracted at index time unless it is absolutely necessary because there are negative performance implications;
  • And a lot more...

Like props.conf, transforms.conf is an integral part of an app, and I recommend reading up on it at the URL above to better understand the configuration file's function.

# This removes Loki's progress bar entries
[setnull]
REGEX = ^[\\\|\-\/\b]+$
DEST_KEY = queue
FORMAT = nullQueue

# This removes the loki.exe execution entry
[setnull2]
REGEX = ^.*?\\etc\\apps\\Splunk_App_loki\\bin>loki\.exe .*$
DEST_KEY = queue
FORMAT = nullQueue

# "REGEX = XXX" parses the "key=value" pattern that isn't comma separated by performing a look ahead to detect the next "key=" entry. 
# FORMAT = $1::$2 tells Splunk that the key/value is to be formatted based on the first group that the regex extracts as the key, and the second group that the regex extracts as the value.
# Message me if you would like a deeper breakdown of how this works, and I would be happy to explain it.
[trans_keyvalues]
REGEX = ([\w\d]+):\s(.*?)(?=((\s[\d\w]+:\s)|$))
FORMAT = $1::$2
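
To sanity-check that regex outside of Splunk, here is a quick standalone Python sketch (not part of the app) that runs the same pattern over a shortened version of the example event from props.conf:

import re

# The same pattern used in the [trans_keyvalues] stanza above.
PATTERN = re.compile(r"([\w\d]+):\s(.*?)(?=((\s[\d\w]+:\s)|$))")

# A shortened version of the example event from props.conf.
sample = ("FILE: C:\\Users\\x\\Downloads\\sample.exe "
          "SCORE: 100 TYPE: EXE MD5: 64a23908ade4bbf2a7c4aa31be3cff24")

# findall returns one tuple per match; the first two groups are the key
# and value, the last two belong to the lookahead and can be ignored.
for key, value, _, _ in PATTERN.findall(sample):
    print("%s = %s" % (key, value))

# Output:
# FILE = C:\Users\x\Downloads\sample.exe
# SCORE = 100
# TYPE = EXE
# MD5 = 64a23908ade4bbf2a7c4aa31be3cff24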

And that's it! Simple, right? It may be overwhelming if you're new to Splunk apps, but the main thing to know is the flow: inputs.conf runs loki.bat (which runs loki.exe) and monitors the loki.log file for scan results; props.conf parses the first half of each Loki event; and transforms.conf parses the rest. Hopefully this is helpful.


Now that we have a full app on the deployment server, re-deploy your deployment server apps using the command:

$SPLUNK_HOME/bin/splunk reload deploy-server

Now you should be able to see Splunk_App_loki on your deployment server. Go to Settings -> Forwarder Management -> Apps, find Splunk_App_loki, and click Edit.


Once in the app configuration section, select the Restart Splunkd checkbox and click Save.


Next, go to the Server Classes tab and create a new server class by clicking New Server Class.


Name it Loki_App_Class (or whatever you want) and click OK. This will bring you to the Loki_App_Class screen:


Note: if you chose to create a Splunk_TA_loki app, you can perform the same steps as above and add your search head to the clients list, or deploy it via the cluster manager.


In the Apps section of the page, click Edit to go to the app list page. Click the Splunk_App_loki app in the left-hand list to add it to the server class, and click Save:


This will take you back to the Loki_App_Class page. Next you will add the clients that you want to run the Loki APT scanner on. Click the Edit button in the Clients section of the page to get the list of clients (e.g. Splunk servers and Splunk Universal Forwarder servers). Add the Windows clients that you want to run Loki on a regular schedule by adding their hostnames to the Include (whitelist) textbox, and click the Save button:


This will cause the clients in the Include (whitelist) of the Loki_App_Class to download and install the Loki app the next time they check in to the deployment server. Every day at 2:00 AM the app will run Loki, save the results to "$SPLUNK_HOME\var\log\splunk\loki.log", and ingest and parse the results into Splunk, turning raw events like the following:

20170417T01:36:13Z,WIN-8J1HPPNE2HB,ALERT,FILE: C:\Program Files\SplunkUniversalForwarder\var\log\splunk\loki.log SCORE: 4630 TYPE: UNKNOWN SIZE: 281385 FIRST_BYTES: 32303137303431375430313a33333a33365a2c57 / 20170417T01:33:36Z,W MD5: 99bb9f6343fc69159a6e03e1ef8c6428 SHA1: 58bf43a5c0ec496e62f2217cfa789df35d1ea953 SHA256: 4e1feaa3b24529737fa5accda9beaa841fb259ed5474087aa1017f8427544c04 CREATED: Sun Apr 16 18:33:36 2017 MODIFIED: Sun Apr 16 18:34:46 2017 ACCESSED: Sun Apr 16 18:33:36 2017REASON_1: Yara Rule MATCH: GRIZZLY_STEPPE_Malware_2 SUBSCORE: 70 DESCRIPTION: Auto-generated rule - file 9acba7e5f972cdd722541a23ff314ea81ac35d5c0c758eb708fb6e2cc4f598a0 MATCHES: Str1: GoogleCrashReport.dll Str2: CrashErrors Str3: CrashSend Str4: CrashAddData Str5: CrashCleanup Str6: CrashInitREASON_2: Yara Rule MATCH: Casper_Included_Strings SUBSCORE: 50 DESCRIPTION: Casper French Espionage Malware - String Match in File - http://goo.gl/VRJNLo MATCHES: Str1: cmd.exe /C FOR /L %%i IN (1,1,%d) DO IF EXIST Str2: & SYSTEMINFO) ELSE EXIT Str3: jpic.gov.sy Str4: perfaudio.dat
20170417T01:38:59Z,WIN-8J1HPPNE2HB,WARNING,FILE: C:\Users\Administrator\AppData\Local\Google\Chrome\User Data\Default\Cache\f_0000ab SCORE: 70 TYPE: RAR SIZE: 257998 FIRST_BYTES: 526172211a0700cf907300000d00000000000000 / Rar!s MD5: b7bec1fe35e86afc5b00f2b72f684406 SHA1: c875243df43d7a0baababf7488df884acffae2f9 SHA256: f1209bbd5163a03c4543607a1ce2c69548fa6bddc977670fad845fc42216c69f CREATED: Mon Feb 06 09:11:44 2017 MODIFIED: Mon Feb 06 09:11:44 2017 ACCESSED: Mon Feb 06 09:11:44 2017REASON_1: Yara Rule MATCH: Cloaked_RAR_File SUBSCORE: 70 DESCRIPTION: RAR file cloaked by a different extension

into parsed key/value pairs that can be used to run reports showing all Loki scan results with a score of 70 and above, or to fire an alert on scores of 100:


Loki Parsed Logs

Conclusion
This is great, but, really, so what? What can we do with this information? The value in this post is in automating a manual task across your enterprise: you no longer have to run the Loki APT scanner by hand on each system in your environment and comb through the results for possible issues. Automate, explore, expand, exploit, and exterminate. With a sea of open source security tools that work well as manual processes, this approach is an excellent way to gain fresh insight into the workings, and malevolent workings, of an enterprise.

Fall of an Empire

While setting up my C2 nodes and redirectors for an engagement, I decided to explore Empire's and Meterpreter's default setups. I thought to myself: who would be foolish enough to allow anyone on the Internet to connect to their C2 servers, and how much information could be extracted in a legal manner? The results were interesting. The first exploratory investigation revealed more than twenty C2 nodes running stock Empire or Meterpreter reverse http/s sessions.

History Of Failure

Coding is hard. The Empire and Metasploit projects both have a history of remote code execution vulnerabilities, which means Red Teams need to go to great lengths to manage risk and keep evil entities from compromising their crown jewels, which include active agents and client data.

Empire RCE
Metasploit RCE

Empire

Empire, now in beta for 2.0, includes both PowerShell Empire and the Python version EmPyre. The Empire listener is based on Python's BaseHTTPServer and provides an abstraction layer on top of it. Let's take a look at the HTTP headers present in the default Empire configuration.

Empire Headers

Using the HTTP request GET / HTTP/1.1, the following headers were returned:

HTTP/1.0 200 OK
Server: Microsoft-IIS/7.5
Date: Wed, 05 Apr 2017 18:26:10 GMT

What stands out here is the general lack of headers that would normally be present in a response. Also, we used HTTP/1.1 as the protocol, but the reply is still HTTP/1.0.

Empire Default Page

<html><body><h1>It works!</h1><p>This is the default web page for this server.</p><p>The web server software is running but no content has been added, yet.</p></body></html>

Hashes Of Default Page

MD5: 885ecd7910c988f1f15fcacca5e1734e
SHA1: b642227fbc703af1a67edb665241fc709ecd6f6e
SHA2: a58fb107072d9523114a1b1f17fbf5e7a8b96da7783f24d84f83df34abc48576
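
These hashes can be reproduced with a few lines of Python, assuming the page body is served byte for byte as shown above, with no trailing newline:

import hashlib

# The default Empire page body, exactly as shown above.
page = (b"<html><body><h1>It works!</h1><p>This is the default web page"
        b" for this server.</p><p>The web server software is running but"
        b" no content has been added, yet.</p></body></html>")

for algo in ("md5", "sha1", "sha256"):
    print(algo, hashlib.new(algo, page).hexdigest())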

Finding Empire Listeners With Shodan

Shodan is a search engine for security researchers and other inquisitive mindsets. It routinely scans common ports across the Internet and makes the resulting data set available for public consumption and inquiry. APIs are also provided for those who wish to work smarter, not harder.

Using the common headers and default web page listed above, we are able to narrow down the list of possible Empire C2 nodes on the Internet with a simple query:

'Microsoft-IIS/7.5' 'It works!' -'Content-Type' -'Set-Cookie'

You'll notice that the results returned are all HTTP/1.0, with profiles matching what we scoped out above.
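
The same query can be automated with Shodan's official Python library (pip install shodan); a minimal sketch, with the API key as a placeholder:

import shodan  # pip install shodan

API_KEY = "<YOUR_SHODAN_API_KEY>"  # placeholder
QUERY = "'Microsoft-IIS/7.5' 'It works!' -'Content-Type' -'Set-Cookie'"

api = shodan.Shodan(API_KEY)
results = api.search(QUERY)

# Each match carries the banner plus host metadata such as ip_str and port.
print("Candidate Empire listeners: %d" % results['total'])
for match in results['matches']:
    print("%s:%s" % (match['ip_str'], match['port']))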


Finding An Exception In Empire

The HTTP module in Empire is located in lib/common/http.py. Go ahead and use your favorite text editor to open that up, and have a look around at the code.

In the RequestHandler class's do_GET method, we have the following piece of code for parsing cookie data.

if cookie:
    # search for a SESSIONID value in the cookie
    parts = cookie.split(";")
    for part in parts:
        if "SESSIONID" in part:
            # extract the sessionID value
            name, sessionID = part.split("=")

Interesting:

name, sessionID = part.split("=")

If there is more than one equal sign in the cookie value, split("=") returns more than two items, and the tuple unpacking raises a ValueError. That line should be:

name, sessionID = part.split("=", 1)

which limits the split to one, so the unpacking always receives exactly two items.
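
You can reproduce the failure in a Python shell:

>>> "SESSIONID=id=id".split("=")
['SESSIONID', 'id', 'id']
>>> name, sessionID = "SESSIONID=id=id".split("=")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ValueError: too many values to unpack (expected 2)
>>> "SESSIONID=id=id".split("=", 1)
['SESSIONID', 'id=id']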

Let's go ahead and try to exploit this from the client side with the following request.

curl http://target:port --cookie 'SESSIONID=id=id'

curl will return the following error, because Python threw an exception while parsing the cookie:

curl: (52) Empty reply from server

Changing Default Values

While executing a Red Team engagement, it is STRONGLY recommended to change the default values of the tools that you use, whether they are scanners or C2 infrastructure. This makes it harder for Blue Team elements to detect portions of your activity. You should also either utilize Empire's whitelisting feature or set up proper access control lists. There is no excuse for leaving your C2 node exposed to the entire Internet.

You should have noticed while browsing http.py that the default page served is also located in that file in the function named default_page.

In order to change the default server name, you must edit the configuration in the empire.db file located in data/. Open it with sqlite3 data/empire.db. You can view the current setting with SELECT server_version FROM config; and something like the following will update it:

update config set server_version = 'nginx' where server_version = 'Microsoft-IIS/7.5';

Going Beyond Shodan

Scans.io is another great resource for Internet-wide scan data, including scans of HTTPS sites. The scan sets are huge, but they offer a very current view of HTTPS servers across the globe. The data is in JSON format, and each record stores the default page base64-encoded.

zgrep 'PGh0bWw+PGJvZHk+PGgxPkl0IHdvcmtzITwvaDE+PHA+VGhpcyBpcyB0aGUgZGVmYXVsdCB3ZWIgcGFnZSBmb3IgdGhpcyBzZXJ2ZXIuPC9wPjxwPlRoZSB3ZWIgc2VydmVyIHNvZnR3YXJlIGlzIHJ1bm5pbmcgYnV0IG5vIGNvbnRlbnQgaGFzIGJlZW4gYWRkZWQsIHlldC48L3A+PC9ib2R5PjwvaHRtbD4=' 20170221-https.gz > /tmp/results.json

This may take several minutes to run, as the datasets are generally several gigabytes in size. The result will be a file containing JSON data for each host that returned the default Empire HTML. You can parse this file and extract each IP address that should be tested, then feed them into the script below.
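
A few lines of Python will handle that parsing step. This sketch assumes each line of the grep output is a single JSON object with an ip field, as in the scans.io HTTPS datasets; the script name extract_ips.py is illustrative:

#!/usr/bin/env python3
# extract_ips.py - pull candidate IP addresses out of the zgrep results.
import json

with open("/tmp/results.json") as handle:
    for line in handle:
        try:
            record = json.loads(line)
        except ValueError:
            continue  # skip partial or malformed lines
        if "ip" in record:
            print(record["ip"])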

Automating Detection With Python

The following script automates both checks against a single host: it verifies the hash of the default page, then confirms the cookie-parsing exception.

#!/usr/bin/env python3

from urllib.request import build_opener, HTTPSHandler
from http.client import RemoteDisconnected
from hashlib import sha256
from sys import argv, exit
from binascii import hexlify
import ssl

class NoException(Exception):
    pass

# Each step is one probe against the target: fetch a URL, then check
# either the hash of the response body or the exception that the
# request raises.
steps = [
    {
        'url': 'https://{}/',
        'response': 'a58fb107072d9523114a1b1f17fbf5e7a8b96da7783f24d84f83df34abc48576',
        'exception': NoException
    },
    {
        'url': 'https://{}/',
        'cookie': 'SESSIONID=id=id',
        'exception': RemoteDisconnected
    }
]

def main():
    if len(argv) != 2:
        print("Usage: %s <ip>" % argv[0])
        exit(1)

    # Empire listeners use self-signed certificates, so skip verification.
    context = ssl._create_unverified_context()

    for step in steps:
        opener = build_opener(HTTPSHandler(context=context))
        if 'cookie' in step:
            opener.addheaders.append(('Cookie', step['cookie']))
        try:
            resp = opener.open(step['url'].format(argv[1]))
        except step['exception']:
            # The expected exception fired; move on to the next step.
            print("[+] Exception correctly called")
            continue
        except Exception:
            print("[!] Unexpected exception found")
            print("[-] IP %s is not an Empire listener" % argv[1])
            exit(1)

        data = resp.read()
        if 'response' in step:
            shasum = sha256()
            shasum.update(data)
            if hexlify(shasum.digest()).decode('utf-8') == step['response']:
                print("[+] Response matches")
            else:
                print("[!] Response doesn't match")
                print("[-] IP %s is not an Empire listener" % argv[1])
                exit(1)

    print("[+] IP %s is an Empire listener" % argv[1])

if __name__ == '__main__':
    main()
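
If you save the script as, say, check_empire.py, point it at a candidate host (the IP below is just an example). Against a default Empire listener, the output walks through both checks:

$ python3 check_empire.py 203.0.113.10
[+] Response matches
[+] Exception correctly called
[+] IP 203.0.113.10 is an Empire listener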