Azure Sentinel Solution

Last month I held a webinar about "Azure security and considerations", where on one of my slides I mentioned the Azure Sentinel solution, which I wanted to cover in more depth here in this blog post. Now the basic thing you need to understand is that Sentinel is a module/solution which runs on top of Log Analytics.
Log Analytics as a standalone component is used by a lot of other services in Azure as well. Just to give an example, Log Analytics can be used in combination with Azure Monitor, Network Watcher, Azure Automation, Application Insights, diagnostics logs, application logs and so on.
Overview of service usage and pricing


Microsoft always recommends that you have as few workspaces as possible. Why? Well, let us say that we have the Log Analytics agent installed on a virtual machine where we want to collect security events from, and at the same time we also want to collect performance metrics for monitoring purposes. For Windows, multihoming is possible, but you cannot define which data goes where, so you are essentially paying double. For other log sources such as Linux and PaaS services you cannot define multihoming, and all the data will go to one workspace. Therefore you should have at least one workspace per region, or else logging will become difficult, especially when you combine that with Sentinel.
Now the next problem that comes up is that when you set up a Log Analytics workspace you also need to decide on a retention time. By default it is 30 days (90 days with Sentinel), but if you also collect performance metrics you are storing that data for 90 days as well. Performance data is not something you need to keep for 90 days; that is only needed for the security logs, for instance.

Luckily, Microsoft last month announced table-based retention. First, install ARMClient –>
choco install armclient --source=https://chocolatey.org/api/v2/
Using the following command you can list all the tables within a workspace:
armclient get "/subscriptions/subscriptionid/resourceGroups/resourcegroup/providers/Microsoft.OperationalInsights/workspaces/workspacename/Tables?api-version=2017-04-26-preview"
So finally this allows experts like us to define custom retention per table. Performance metrics can be stored for 30 days while security events are stored for up to 1 year, for instance.
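For example, setting a shorter retention on the Perf table could look something like this (a sketch using the same preview API version as above; the table name, workspace path and retentionInDays value are placeholders you would adjust):
armclient put "/subscriptions/subscriptionid/resourceGroups/resourcegroup/providers/Microsoft.OperationalInsights/workspaces/workspacename/Tables/Perf?api-version=2017-04-26-preview" "{'properties': {'retentionInDays': 30}}"
You can then list the tables again with the get command above to verify the new retention value.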
Also, one thing that you should remember is that Azure Monitor has a default ingestion rate threshold of 500 MB/min per workspace. If you send data at a higher rate to a single workspace, some data is dropped. You can create a custom alert rule based on the following query if you run into this error:
Operation
| where OperationCategory == "Ingestion"
| where Detail startswith "The rate of data crossed the threshold"

Role Based Access Control

Previously there have been limited options to define proper role based access control for Log Analytics workspaces. You could only limit access based upon Azure RBAC, which previously meant either full access to read data or no access at all. That doesn't work well if you have a workspace that is used by multiple teams, with the security team using it for Sentinel and hunting purposes; you don't want the app team to have access to the security logs.
First off, a Log Analytics workspace has two access models that can be defined:
·         Workspace Permissions (Default for Workspaces Created before March 2019)
·         Resource or Workspace Permissions (Default for Workspaces Created after March 2019)
With resource-based access you can give users access to only certain tables within a Log Analytics workspace. This means that you can grant just read access to the resource. Permissions can be inherited (such as from the containing resource group) or directly assigned to the resource. Permission to the logs of the resource is then automatically granted to those users in the workspace.
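If you want to flip an existing workspace over to the resource or workspace permissions model, it is controlled by the enableLogAccessUsingOnlyResourcePermissions feature flag on the workspace. A minimal sketch with armclient (the api-version here is my assumption, so verify it against the current REST API reference):
armclient patch "/subscriptions/subscriptionid/resourceGroups/resourcegroup/providers/Microsoft.OperationalInsights/workspaces/workspacename?api-version=2020-08-01" "{'properties': {'features': {'enableLogAccessUsingOnlyResourcePermissions': true}}}"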
For Sentinel there are also some built-in roles that can be used:
·         Azure Sentinel reader: A user assigned with this role has viewing rights to Azure Sentinel. The user can view incidents and data but cannot make changes.
·         Azure Sentinel responder: A user assigned with this role can read and perform actions on incidents such as assignment and severity changes.
·         Azure Sentinel contributor: A user assigned with this role can read and perform actions on incidents and create and delete analytic rules.
These roles can only be assigned at the workspace level.
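Assigning one of these roles happens at the workspace scope like any other Azure role assignment. A quick sketch with the Azure CLI (the user and the workspace resource ID are placeholders):
az role assignment create --role "Azure Sentinel Responder" --assignee user@yourdomain.com --scope "/subscriptions/subscriptionid/resourceGroups/resourcegroup/providers/Microsoft.OperationalInsights/workspaces/workspacename"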

If you want to define a custom RBAC policy for table-specific access you need to define a custom role, following the setup here –> https://docs.microsoft.com/en-us/azure/role-based-access-control/custom-roles
NOTE: Global permissions override any table-based RBAC. Also, if a user is granted per-table access but no other permissions, they will be able to access log data from the API but not from the Azure portal. To provide access from the Azure portal, use Log Analytics Reader as the base role.
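To sketch what such a custom role could look like for per-table access, the role definition below grants read access to only the Heartbeat table (the role name, the table and the assignable scope are just examples; save it as a JSON file and create it with az role definition create --role-definition @tablereader.json):
{
  "Name": "Heartbeat Table Reader",
  "Description": "Example custom role that can read only the Heartbeat table.",
  "Actions": [
    "Microsoft.OperationalInsights/workspaces/read",
    "Microsoft.OperationalInsights/workspaces/query/read",
    "Microsoft.OperationalInsights/workspaces/query/Heartbeat/read"
  ],
  "NotActions": [],
  "AssignableScopes": [ "/subscriptions/subscriptionid" ]
}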


MSP

For managed service providers you would essentially need to leverage Azure Lighthouse (delegated access), where you have multiple Sentinel & Log Analytics instances, each defined within a customer subscription, so you can pull the data from the different sources that they have. You then configure the different Log Analytics roles as part of the Lighthouse delegation.

However, there are currently some limitations when it comes to an MSP approach.
·         All rules and logic are defined within each workspace (which means that rules created in one tenant are not directly applicable to the other tenants; the same goes for automation jobs that are created).
·         There is no way to search across multiple tenants (you need to go into each tenant and do your hunting there).
·         Cost still goes directly to the subscription owner, since it is still not possible to host the Log Analytics workspace in an MSP tenant and connect the data sources from the customer tenant. Therefore all Sentinel and Log Analytics costs will go directly to the customer.
Having this approach also allows you to customize modules and data sources for each tenant, because in most cases what kind of data sources they have on their end will differ from tenant to tenant.
However, you can still send alerts from all your tenant-based Sentinel instances directly to your own ITSM tool using Logic Apps automation, since you can essentially just recreate the Logic App to point alerts created in the customer Sentinel instances back at your ITSM tool to open an incident or such.
And FYI, just some quick comments about pricing:
·         Sentinel pricing is based upon data analyzed, not data ingested.
·         The more data there is in the datasets referenced by a hunting query, the higher the cost will be.
·         Use time filters or scoping in your queries to ensure that you can control cost (see the example after this list).
·         Some of the predefined queries have date limits defined, but not all!
·         It is still unclear whether regular Log Analytics search queries will affect the cost.
·         You pay nothing extra when you ingest data from Office 365 audit logs, Azure activity logs, and alerts from Microsoft threat protection solutions.
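To illustrate the time filter point above, scoping a hunting query on TimeGenerated keeps the analyzed dataset (and thereby the cost) down. A minimal KQL sketch against the SecurityEvent table (event ID 4625, failed logons, is just an example):
SecurityEvent
| where TimeGenerated > ago(7d) // only analyze the last 7 days
| where EventID == 4625 // failed logon attempts
| summarize FailedLogons = count() by Account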
Also, if you need to remove entries or log files from the log set, this can only be done using the REST API data purge command (see the example call after the list):
·         Microsoft.OperationalInsights/workspaces/{workspaceName}/purge?api-version=2015-03-20
·         The Data Purger role or higher is required.
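A sketch of what that purge call could look like with armclient (the table name and filter values are placeholders, and note that purge operations run asynchronously):
armclient post "/subscriptions/subscriptionid/resourceGroups/resourcegroup/providers/Microsoft.OperationalInsights/workspaces/workspacename/purge?api-version=2015-03-20" "{'table': 'Heartbeat', 'filters': [{'column': 'TimeGenerated', 'operator': '<', 'value': '2020-01-01T00:00:00'}]}"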
Let me know your thoughts and questions, and if you require any help feel free to contact us.
