{"id":57371,"date":"2020-03-30T15:43:43","date_gmt":"2020-03-30T14:43:43","guid":{"rendered":"http:\/\/content.n4stack.io\/?p=57371"},"modified":"2020-04-30T12:24:31","modified_gmt":"2020-04-30T11:24:31","slug":"azure-sentinel-network-ids","status":"publish","type":"post","link":"http:\/\/content.n4stack.io\/2020\/03\/30\/azure-sentinel-network-ids\/","title":{"rendered":"Network IDS & Azure Sentinel"},"content":{"rendered":"


I’ve recently started using Azure Sentinel and exploring some of its capabilities – there are currently around 40 built-in data connectors that ingest logs from different services and products.


I decided to see if I could add integrations with some open-source network tools, and Zeek (formerly Bro) seemed like a perfect place to start. Rather than logging packets that match a specific rule (the focus of Snort/Suricata), Zeek can be configured to log pretty much anything – out of the box it logs metadata on all SSL connections, DNS lookups, HTTP requests, etc.


I won’t go through the basic setup for Zeek since that’s much better documented elsewhere; suffice to say I installed Debian 10 on a small physical box and then installed Zeek via Apt. I then installed the Azure OMS agent, which collects logs and sends them into Azure.
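In rough shell terms the setup looks something like this – a sketch, assuming the Zeek package is available from an apt repository you’ve already added, and with your own workspace ID and key substituted into Microsoft’s documented OMS onboarding script:

```
# Install Zeek from a pre-configured apt repository
sudo apt update && sudo apt install -y zeek

# Fetch and run Microsoft's OMS agent onboarding script; the workspace
# ID and key come from the Log Analytics workspace in the Azure portal
wget https://raw.githubusercontent.com/Microsoft/OMS-Agent-for-Linux/master/installer/scripts/onboard_agent.sh
sudo sh onboard_agent.sh -w <WORKSPACE_ID> -s <WORKSPACE_KEY>
```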


At this point I began to run into issues: logs weren’t being ingested into Log Analytics, and the OMS agent logs showed:

“Stream error in the HTTP/2 framing layer. Make sure Curl and SSL libraries are up to date.”

Some testing with Curl led me to the following issue, which was fixed in a newer version of Curl than the one available in the stable release of Debian.
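If you’re hitting the same thing, a quick way to see what you’re running (and whether apt can offer anything newer) is:

```
curl --version    # the curl/libcurl versions in use
apt policy curl   # candidate versions available from configured repos
```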

After some messing about with installing testing versions of curl/libc6 (and breaking more things than I fixed), I checked the Curl version in the latest version of Ubuntu, found that it was newer, and nuked the Debian install in favour of ‘Eoan Ermine’. I’m confident that someone with more Linux skills could have resolved this, but I just wanted to get it working!

So I’ve now got a machine logging network events into /opt/zeek/logs/current/ and a connector that can ship logs into an Azure Log Analytics workspace – I just need to get those two connected together! In the Advanced Settings of the Log Analytics workspace there’s a blade for Custom Logs.


In order to use this, we first need to grab a copy of a sample log file from the Zeek logs directory – I’ll start with DNS as it’s a really great source of data. I removed the first 8 lines from the start of the file so that the first line is the first record.

[Screenshot: the headers at the start of the file that you’ll want to strip out.]
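Something like the following does the trick, assuming the default log location:

```
# Zeek prefixes each log with 8 commented header lines (#separator,
# #fields, #types, ...); drop them so the sample starts at a record
tail -n +9 /opt/zeek/logs/current/dns.log > dns-sample.log
```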

It’s useful to copy the list of fields to another file to reference later. I then informed Azure that the log entries were separated by line breaks.
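For example, the #fields header line holds exactly the list you’ll want when writing the parser:

```
# Save the tab-separated field names for later reference
grep '^#fields' /opt/zeek/logs/current/dns.log > dns-fields.txt
```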


Note there’s no need to define individual fields at this point – each log entry will be stored as a single string, and parsing happens when the data is queried. This means that even if the log format changes in future, all the data will still be stored.

The final steps are to tell the agent where it should look for this log file and to give it a snappy name.
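For reference, the values I’d expect here look something like the below – the name is your choice (ZeekDNS is just an example), and Log Analytics appends _CL to whatever you pick:

```
Log collection path:  /opt/zeek/logs/current/dns.log
Custom log name:      ZeekDNS   (queried as ZeekDNS_CL)
```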


Once you’ve added the custom log and saved the configuration, it can take up to 20 minutes before you start seeing the new log entries in Sentinel – this is a good opportunity to go through a few more log types and get them set up.

When you are getting some data into Sentinel, it’s time to start parsing out some useful fields from the logs. Since the Zeek logs are tab-delimited, the easiest way to handle them with KQL is to split each raw entry into a dynamic array using the split() function, and then create individual fields from there.
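Here’s a minimal sketch of a DNS parser, assuming the custom log table is named ZeekDNS_CL, that each raw entry lands in the RawData column, and that the field order matches the #fields header of a stock dns.log – check the indexes against your own header, as they can differ between Zeek versions:

```
ZeekDNS_CL
// split each tab-delimited record into a dynamic array
| extend f = split(RawData, "\t")
// project out the interesting columns; indexes follow the #fields header
| project TimeGenerated,
    ts         = unixtime_seconds_todatetime(todouble(f[0])),
    uid        = tostring(f[1]),    // connection ID, shared across log types
    id_orig_h  = tostring(f[2]),    // source IP
    id_orig_p  = toint(f[3]),
    id_resp_h  = tostring(f[4]),    // responder IP
    id_resp_p  = toint(f[5]),
    proto      = tostring(f[6]),
    query      = tostring(f[9]),
    qtype_name = tostring(f[13]),
    rcode_name = tostring(f[15]),
    answers    = tostring(f[21])
```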


I recommend saving the parser for each type of log as a function – this means you can then use it to write simpler queries later.
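For instance, if the parser above is saved as a function called ZeekDns (the name is arbitrary), finding the noisiest lookups becomes trivial:

```
// top 10 most-queried names over the last day
ZeekDns
| where ts > ago(1d)
| summarize lookups = count() by query
| top 10 by lookups
```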


It’s also useful to remember that the connection ID is consistent across the different log files, which means you can join the various tables on that field.
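As a hypothetical example, assuming you’ve also saved a ZeekConn function over conn.log (with orig_bytes/resp_bytes parsed out), the uid field ties a lookup to its connection:

```
// enrich DNS lookups with byte counts from the matching connection
ZeekDns
| join kind=inner (ZeekConn) on uid
| project ts, id_orig_h, query, orig_bytes, resp_bytes
```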

Some ideas for using DNS logs in Sentinel: