Manage data integrity
Splunk Enterprise data integrity control helps you verify the integrity of data that it indexes.
When you enable data integrity control for an index, Splunk Enterprise computes hashes on every slice of data using the SHA-256 algorithm. It then stores those hashes so that you can verify the integrity of your data later.
Splunk Enterprise supports data integrity control on local indexes only. SmartStore indexes are not supported.
Data integrity control works only on Splunk Enterprise. Splunk Cloud Platform does not use data integrity control.
How data verification works
When you enable data integrity control, Splunk Enterprise computes hashes on every slice of newly indexed raw data and writes them to an l1Hashes file. When the bucket rolls from hot to warm, Splunk Enterprise computes a hash on the contents of the l1Hashes file and stores the computed hash in l2Hash. Splunk Enterprise stores both hash files in the rawdata directory for that bucket.
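As a rough sketch, the hash files sit alongside the raw data journal inside the bucket's rawdata directory. The index name and bucket directory below are hypothetical, and the exact hash file names can vary by version:
$SPLUNK_DB/main/db/db_1548000000_1547000000_5/rawdata/journal.gz (compressed raw data journal)
$SPLUNK_DB/main/db/db_1548000000_1547000000_5/rawdata/l1Hashes (per-slice SHA-256 hashes)
$SPLUNK_DB/main/db/db_1548000000_1547000000_5/rawdata/l2Hash (hash of the l1Hashes contents, written when the bucket rolls to warm)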
Data integrity control generates hashes on newly indexed data only. To ensure that data that comes from a forwarder is secure, encrypt that data using SSL. For more information, see About securing Splunk with SSL.
Check data verification hashes to validate data
To verify the integrity of an index or a bucket, run one of the following CLI commands:
./splunk check-integrity -bucketPath [ bucket path ] [ -verbose ]
./splunk check-integrity -index [ index name ] [ -verbose ]
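For example, to verify an index named main, or a single bucket within it (the index name and bucket directory here are hypothetical):
./splunk check-integrity -index main -verbose
./splunk check-integrity -bucketPath $SPLUNK_DB/main/db/db_1548000000_1547000000_5 -verbose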
Configure data integrity control
To configure data integrity control, edit the indexes.conf configuration file. For each index on which you want to enable data integrity control, set the enableDataIntegrityControl setting to true. The default value for this setting on all indexes is false (off).
enableDataIntegrityControl=true
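For example, a minimal indexes.conf stanza that turns on data integrity control for a hypothetical index named secure_logs might look like this (the index name and paths are illustrative only):
[secure_logs]
homePath = $SPLUNK_DB/secure_logs/db
coldPath = $SPLUNK_DB/secure_logs/colddb
thawedPath = $SPLUNK_DB/secure_logs/thaweddb
enableDataIntegrityControl = true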
Data integrity control in clustered environments
In a clustered environment, the cluster manager and all peer nodes must run Splunk Enterprise 6.3 or higher to ensure accurate index replication with data integrity control enabled.
Optionally modify the size of your data slice
By default, the data slice size is 128KB, which means that a data slice is created and hashed every 128KB. You can optionally edit the indexes.conf configuration file to specify the size of each slice in bytes:
rawChunkSizeBytes = 131072
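The default value of 131072 bytes corresponds to 128KB (128 x 1024). For example, to create and hash a slice every 256KB instead, you would set:
rawChunkSizeBytes = 262144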
Store and secure data hashes
For optimal security, you can optionally store data integrity hashes outside the instance that hosts your data, such as on a different server. To avoid naming conflicts, store secured hashes in separate directories.
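As one possible approach (not prescribed by this documentation), you could copy each bucket's hash files to a separate host, keeping the index and bucket names in the destination path so that hashes from different buckets do not collide. The host name, index, and bucket directory below are hypothetical, and the exact hash file names can vary:
scp $SPLUNK_DB/main/db/db_1548000000_1547000000_5/rawdata/l1Hashes* $SPLUNK_DB/main/db/db_1548000000_1547000000_5/rawdata/l2Hash* hashstore.example.com:/secure/hashes/main/db_1548000000_1547000000_5/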
Regenerate hashes
If you lose data hashes for a bucket, use the following CLI command to regenerate the files on a bucket or index. This command extracts the hashes that exist in the journal:
./splunk generate-hash-files -bucketPath [ bucket path ] [ -verbose ]
./splunk generate-hash-files -index [ index name ] [ -verbose ]
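For example, to regenerate the hash files for a hypothetical index named main:
./splunk generate-hash-files -index main -verbose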