Mitigation Strategy: Strict Input Validation and Sanitization Before Logging
- Description:
  - Identify all logging points: Find every instance where `$logger->{level}()` is called in your code.
  - Analyze input sources: For each logging point, determine where the data being logged originates (user input, API responses, database results, etc.).
  - Implement type-specific sanitization: Before passing any data to the logger, apply appropriate sanitization:
    - URLs: `filter_var($url, FILTER_SANITIZE_URL)`
    - Emails: `filter_var($email, FILTER_SANITIZE_EMAIL)`
    - Usernames/Text: Control character removal/replacement, length limits, whitelists, and HTML-escaping only if the log viewer renders HTML.
    - Numbers: `filter_var($number, FILTER_SANITIZE_NUMBER_INT)` or `FILTER_SANITIZE_NUMBER_FLOAT`.
  - Centralize sanitization: Create reusable functions or a class to handle sanitization, avoiding code duplication (see the sketch after this list).
  - Use the context array: Pass sanitized data within the context array, not directly concatenated into the message string.
  - Unit test: Verify sanitization functions with various inputs, including malicious ones (a test sketch follows below).
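
A minimal sketch of such a centralized helper, assuming a PSR-3 logger instance in `$logger`; the `LogSanitizer` class, its method names, and the 512-character limit are illustrative, while the `filter_var()` filters are PHP built-ins:

```php
<?php

/**
 * Hypothetical centralized sanitizer applied to every value before logging.
 * Strips control characters (including newlines) to prevent log forging,
 * caps the length, and offers type-specific filters.
 */
final class LogSanitizer
{
    private const MAX_LENGTH = 512; // illustrative limit

    public static function text(string $value): string
    {
        // Remove ASCII control characters (0x00-0x1F, 0x7F), then cap the length.
        $clean = preg_replace('/[\x00-\x1F\x7F]/u', '', $value) ?? '';
        return mb_substr($clean, 0, self::MAX_LENGTH);
    }

    public static function url(string $value): string
    {
        return self::text((string) filter_var($value, FILTER_SANITIZE_URL));
    }

    public static function email(string $value): string
    {
        return self::text((string) filter_var($value, FILTER_SANITIZE_EMAIL));
    }

    public static function int(string $value): string
    {
        return (string) filter_var($value, FILTER_SANITIZE_NUMBER_INT);
    }
}

// Usage: sanitized values go into the context array, never the message string.
$logger->info('Login attempt', [
    'username' => LogSanitizer::text($username),
    'referer'  => LogSanitizer::url($referer),
]);
```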
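A correspondingly small PHPUnit sketch for the helper above, assuming the `LogSanitizer` class from the previous example is autoloadable; the test names and inputs are illustrative:

```php
<?php

use PHPUnit\Framework\TestCase;

// Hypothetical tests exercising the LogSanitizer sketch with malicious inputs.
final class LogSanitizerTest extends TestCase
{
    public function testTextStripsNewlinesUsedForLogForging(): void
    {
        $forged = "alice\nINFO: admin logged in";
        $this->assertSame('aliceINFO: admin logged in', LogSanitizer::text($forged));
    }

    public function testUrlRemovesControlCharacters(): void
    {
        $this->assertSame(
            'https://example.com/?q=1',
            LogSanitizer::url("https://example.com/?q=1\r\n")
        );
    }
}
```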
- Threats Mitigated:
  - Log Forging (High Severity): Prevents injection of newline characters or control characters to create fake log entries.
  - Cross-Site Scripting (XSS) (High Severity): Prevents injection of JavaScript if the log viewer renders HTML.
  - Code Injection (High Severity): Mitigates potential code injection in vulnerable log processing tools.
  - Data Leakage (Medium Severity): Helps prevent logging of malformed data that might reveal internal information.
- Impact:
  - Log Forging: Risk reduced from High to Low.
  - XSS: Risk reduced from High to Low (if applicable).
  - Code Injection: Risk reduced from High to Low (in specific cases).
  - Data Leakage: Risk reduced from Medium to Low.
- Currently Implemented:
  - Partial sanitization in `UserController` and `ApiRequestLogger`.
- Missing Implementation:
  - Consistent sanitization across all logging points.
  - Dedicated sanitization functions/classes.
  - Comprehensive unit tests for sanitization.
  - Consistent use of the context array.
Mitigation Strategy: Contextual Data Handling and Masking
- Description:
  - Identify sensitive data: List all data considered sensitive (PII, credentials, API keys, etc.).
  - Choose masking methods:
    - Redaction: Replace with `[REDACTED]` or similar.
    - Hashing: Use a strong cryptographic hash (e.g., SHA-256) for correlation without revealing the original data.
    - Partial Masking: Reveal only a portion (e.g., the last four digits).
  - Implement masking before logging:
    - Create a masking function/class: This function should handle different data types (see the sketch after this list).
    - Integrate into logging calls: Call the masking function before passing data to the logger.
  - Prioritize the context array: Pass masked data within the context array. This is crucial for structured logging.
  - Unit test: Verify masking functions correctly handle all sensitive data types.
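
A sketch of how such a helper might look; the `DataMasker` class and its method names are hypothetical, and the hashing variant uses PHP's built-in `hash()` with SHA-256 as suggested above:

```php
<?php

/**
 * Hypothetical masking helper applied before data reaches the logger.
 */
final class DataMasker
{
    /** Replace the value entirely. */
    public static function redact(): string
    {
        return '[REDACTED]';
    }

    /** Hash so the same value can still be correlated across log entries. */
    public static function hash(string $value): string
    {
        return hash('sha256', $value);
    }

    /** Reveal only the last $visible characters, e.g. a card or key suffix. */
    public static function partial(string $value, int $visible = 4): string
    {
        $len = mb_strlen($value);
        if ($len <= $visible) {
            return str_repeat('*', $len);
        }
        return str_repeat('*', $len - $visible) . mb_substr($value, -$visible);
    }
}

// Usage: masked values are passed in the context array.
$logger->info('Outbound API call', [
    'api_key'  => DataMasker::partial($apiKey),
    'user_id'  => DataMasker::hash((string) $userId),
    'password' => DataMasker::redact(),
]);
```

For low-entropy values (emails, phone numbers), a keyed HMAC (`hash_hmac()`) is a safer correlation choice than a plain hash, since plain hashes of guessable inputs can be brute-forced.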
- Threats Mitigated:
  - Data Breach (Critical Severity): Prevents exposure of sensitive data in logs.
  - Privacy Violation (High Severity): Protects user privacy.
  - Compliance Violations (High Severity): Helps meet data protection regulations.
  - Reputational Damage (High Severity): Reduces risk of negative publicity.
- Impact:
  - Data Breach: Risk reduced from Critical to Low.
  - Privacy Violation: Risk reduced from High to Low.
  - Compliance Violations: Risk reduced from High to Low.
  - Reputational Damage: Risk reduced from High to Low.
- Currently Implemented:
  - Partial masking of API keys in `ApiRequestLogger`.
- Missing Implementation:
  - Consistent masking strategy across the application.
  - Masking of other sensitive data types (session tokens, passwords).
  - Dedicated masking function/class.
  - Consistent use of the context array for masked data.
  - Unit tests for masking.
Mitigation Strategy: Structured Logging (JSON) with Context
- Description:
  - Use a JSON-supporting logger: Choose a PSR-3 logger like Monolog with JSON formatting capabilities.
  - Configure JSON formatter: Set up the logger to use a `JsonFormatter` (see the sketch after this list).
  - Define a log schema: Establish consistent fields for all log entries (timestamp, level, message, context, etc.).
  - Always use the context array: Pass all data (except the main message string) as key-value pairs in the context array. This is the core of structured logging.
  - Combine with sanitization and masking: Ensure data passed in the context array is already sanitized and masked as needed.
  - Test JSON output: Verify log entries are valid JSON and adhere to the defined schema.
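
A minimal Monolog setup along these lines (Monolog 3 class names; the channel name, file path, and context variables are placeholders):

```php
<?php

require 'vendor/autoload.php';

use Monolog\Formatter\JsonFormatter;
use Monolog\Handler\StreamHandler;
use Monolog\Level;
use Monolog\Logger;

// Each record is written as one JSON object per line, so entries stay
// independently parseable by log shippers and analysis tools.
$handler = new StreamHandler('/var/log/app/app.json.log', Level::Info);
$handler->setFormatter(new JsonFormatter());

$logger = new Logger('app');
$logger->pushHandler($handler);

// All variable data goes into the context array, never the message string.
$logger->warning('Failed login', [
    'username'   => $sanitizedUsername, // already sanitized/masked upstream
    'ip'         => $clientIp,
    'request_id' => $requestId,
]);
```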
- Threats Mitigated:
  - Difficult Log Analysis (Medium Severity): Enables easier parsing, searching, and analysis.
  - Log Injection (Medium Severity): JSON encoding helps mitigate some injection attacks.
  - Inefficient Log Processing (Low Severity): Structured logs are more efficient to process.
- Impact:
  - Difficult Log Analysis: Risk reduced from Medium to Low.
  - Log Injection: Risk reduced from Medium to Low.
  - Inefficient Log Processing: Risk reduced from Low to Negligible.
- Currently Implemented:
  - None.
- Missing Implementation:
  - Complete migration to structured logging with JSON and consistent context array usage.
Mitigation Strategy: Log Level Management
- Description:
  - Use PSR-3 levels correctly: `debug` and `info` should never contain sensitive information. Use `warning` or `error` as the default production level.
  - Avoid verbose logging in production: Minimize the use of `debug` and `info` in production environments.
  - Control log level dynamically: Use environment variables, configuration files, or a runtime API to adjust the log level without redeployment. This allows for temporary increases in verbosity for troubleshooting (see the sketch after this list).
  - Never log sensitive information at the `debug` or `info` levels.
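
A sketch of driving the level from an environment variable at bootstrap, assuming Monolog as the PSR-3 implementation and a hypothetical `LOG_LEVEL` variable; `Logger::toMonologLevel()` accepts level names such as `warning` and rejects unknown ones. The file path and context values are placeholders:

```php
<?php

require 'vendor/autoload.php';

use Monolog\Handler\StreamHandler;
use Monolog\Logger;

// Read the desired level from the environment; default to 'warning' in production.
$levelName = getenv('LOG_LEVEL') ?: 'warning';
$level = Logger::toMonologLevel($levelName);

$logger = new Logger('app');
$logger->pushHandler(new StreamHandler('/var/log/app/app.json.log', $level));

// Raising verbosity for troubleshooting becomes a configuration change
// (e.g. LOG_LEVEL=debug) rather than a redeployment.
$logger->debug('Cache miss', ['key' => $cacheKey]); // emitted only when the level allows it
```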
- Threats Mitigated:
  - Excessive Logging (Medium Severity): Reduces log volume.
  - Data Leakage (Medium Severity): Reduces risk of sensitive data in lower-level logs.
  - Performance Degradation (Low Severity): Excessive logging can impact performance.
  - Disk Space Exhaustion (Low Severity): Excessive logging can fill disk space.
- Impact:
  - Excessive Logging: Risk reduced from Medium to Low.
  - Data Leakage: Risk reduced from Medium to Low.
  - Performance Degradation: Risk reduced from Low to Negligible.
  - Disk Space Exhaustion: Risk reduced from Low to Negligible.
- Currently Implemented:
  - Uses different log levels, but `info` is the default in production.
- Missing Implementation:
  - Dynamic log level configuration.
  - Strict adherence to not logging sensitive data at lower levels.