[TEST-Copilot] Cryptographic Validation For Authenticated Variables#343
Conversation
- …validation
- Renamed script to validate_kek.py to reflect broader functionality
- Added validate_single_kek() function to validate individual KEK files
- Updated main() to detect and handle both file and folder inputs
- Improved output messages: 'Cryptographic Signature: VALID/INVALID' and 'Expected Payload: True/False'
- JSON output always generated for CI integration
- Preserves git history through git mv
- Created validate-kek-updates.yml workflow that runs on PRs
- Triggers when .bin files in KEK directories are added or modified
- Validates cryptographic signatures and payloads for all changed KEK files
- Fails CI if any KEK has an invalid signature
- Warns (but doesn't fail) if payload doesn't match expected hash
- Uploads validation results as artifacts
- Adds detailed summary to PR with pass/fail status for each file
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Force-pushed from 81658c8 to 119aa64
Pull request overview
This pull request introduces cryptographic signature verification capabilities for authenticated UEFI variables, specifically targeting KEK (Key Exchange Key) update files. The changes add automation to validate KEK files in pull requests and enhance the auth_var_tool.py script with new verification and improved output features.
Changes:
- Added a GitHub Actions workflow to automatically validate KEK update files in pull requests, checking for valid cryptographic signatures and expected payloads
- Introduced a new `verify` command in auth_var_tool.py that validates PKCS7 signatures of authenticated variables
- Enhanced the `describe` command to convert hex-encoded ASN.1 strings to human-readable text
- Created validate_kek.py script for batch validation of KEK files with JSON output
Reviewed changes
Copilot reviewed 3 out of 3 changed files in this pull request and generated 13 comments.
| File | Description |
|---|---|
| `.github/workflows/validate-kek-updates.yml` | New GitHub Actions workflow that validates KEK files in PRs, generates validation reports, and comments on PRs when validation fails |
| `scripts/validate_kek.py` | New validation script that checks KEK files for valid signatures and expected payloads, supporting both single file and folder validation |
| `scripts/auth_var_tool.py` | Added `verify` command for cryptographic signature validation, improved `describe` command output with hex-to-readable conversion, and added supporting PKCS7 verification functions |
```python
# Validate the file using auth_var_tool.verify_variable
# Create a namespace object with the required arguments
import argparse
```

The `argparse` module is already imported at the top of the file (line 7). This redundant import inside the function should be removed to follow Python best practices and keep the code clean.

Suggested change:
```diff
-import argparse
```
```python
# Validate the file using auth_var_tool.verify_variable
# Create a namespace object with the required arguments
import argparse
```

The `argparse` module is already imported at the top of the file (line 7). This redundant import inside the function should be removed to follow Python best practices and keep the code clean.

Suggested change:
```diff
-import argparse
```
```python
    str
        The content with hex strings converted to readable format where possible
    """
    import re
```

The `re` module is already imported at the top of the file (line 49). This redundant import inside the function should be removed to follow Python best practices and keep the code clean.

Suggested change:
```diff
-import re
```
```python
# Second byte is the length
length = hex_bytes[1]
if len(hex_bytes) >= 2 + length:
    # Extract the string content
    string_data = hex_bytes[2:2+length]
```

The ASN.1 length decoding assumes short form length encoding (a single byte, line 686). ASN.1 also supports long form length encoding, where the first byte indicates how many subsequent bytes encode the length (if bit 7 is set). While short form is common for small strings in certificates, this could fail to decode longer strings correctly. Consider adding support for long form length encoding or documenting this limitation.

Suggested change:
```diff
-# Second byte is the length
-length = hex_bytes[1]
-if len(hex_bytes) >= 2 + length:
-    # Extract the string content
-    string_data = hex_bytes[2:2+length]
+# Decode ASN.1 length (support short and long form)
+first_len = hex_bytes[1]
+if first_len & 0x80 == 0:
+    # Short form: length is in the first length octet
+    length = first_len
+    content_offset = 2
+else:
+    # Long form: lower 7 bits indicate number of subsequent length octets
+    num_len_bytes = first_len & 0x7F
+    if num_len_bytes == 0:
+        # Indefinite form is not expected in DER; treat as decode failure
+        raise ValueError("Indefinite length encoding not supported")
+    if len(hex_bytes) < 2 + num_len_bytes:
+        raise ValueError("Insufficient bytes for ASN.1 length")
+    length_bytes = hex_bytes[2:2 + num_len_bytes]
+    length = int.from_bytes(length_bytes, byteorder="big")
+    content_offset = 2 + num_len_bytes
+if len(hex_bytes) >= content_offset + length:
+    # Extract the string content
+    string_data = hex_bytes[content_offset:content_offset + length]
```
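The short/long form distinction the suggestion implements can be seen in a small standalone sketch (independent of the PR's code; the helper name is illustrative). DER short form packs the length into a single octet below 0x80, while long form sets bit 7 and uses the low bits to count the big-endian length octets that follow:

```python
def decode_der_length(data: bytes, offset: int) -> tuple[int, int]:
    """Decode a DER length field starting at `offset`.

    Returns (length, offset_of_content). Supports both short form
    (single octet < 0x80) and long form (0x80 | n, followed by n
    big-endian length octets). Raises ValueError on indefinite or
    truncated encodings, which are invalid in DER.
    """
    first = data[offset]
    if first & 0x80 == 0:
        return first, offset + 1
    num_len_bytes = first & 0x7F
    if num_len_bytes == 0:
        raise ValueError("Indefinite length is not valid DER")
    if len(data) < offset + 1 + num_len_bytes:
        raise ValueError("Truncated DER length")
    length = int.from_bytes(data[offset + 1:offset + 1 + num_len_bytes], "big")
    return length, offset + 1 + num_len_bytes

# Short form: UTF8String (0x0C), length 5, "hello"
short = bytes([0x0C, 0x05]) + b"hello"
# Long form: 0x81 0x05 encodes the same 5-byte length in one extra octet
long_ = bytes([0x0C, 0x81, 0x05]) + b"hello"

for buf in (short, long_):
    length, start = decode_der_length(buf, 1)
    print(buf[start:start + length].decode("utf-8"))  # hello
```

Certificates routinely contain strings longer than 127 bytes (e.g. concatenated subject fields), which is where the short-form-only decoder silently misreads the buffer.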
| """Validate KEK update file(s) and generate a JSON report. | ||
|
|
||
| This script validates authenticated variable files - either a single file or all | ||
| files in a specified folder - and generates a JSON report with validation results. | ||
| """ | ||
|
|
||
| import argparse | ||
| import hashlib | ||
| import json | ||
| import logging | ||
| import sys | ||
| from datetime import datetime, timezone | ||
| from pathlib import Path | ||
|
|
||
| # Import validation functions from auth_var_tool | ||
| sys.path.insert(0, str(Path(__file__).parent)) | ||
| # Import the verify function from auth_var_tool | ||
| from auth_var_tool import verify_variable | ||
| from edk2toollib.uefi.authenticated_variables_structure_support import EfiVariableAuthentication2 | ||
|
|
||
| # Standard KEK parameters | ||
| KEK_NAME = "KEK" | ||
| KEK_GUID = "8be4df61-93ca-11d2-aa0d-00e098032b8c" | ||
| KEK_ATTRIBUTES = "NV,BS,RT,AT,AP" | ||
|
|
||
| # Expected payload hash for Microsoft 2023 KEK (EFI Signature List with x.509) | ||
| EXPECTED_PAYLOAD_HASH = "5b85333c009d7ea55cbb6f11a5c2ff45ee1091a968504c929aed25c84674962f" | ||
|
|
||
|
|
||
| def validate_single_kek( | ||
| kek_file: Path, | ||
| quiet: bool = False | ||
| ) -> dict: | ||
| """Validate a single KEK update file. | ||
|
|
||
| Args: | ||
| kek_file: Path to KEK update file | ||
| quiet: If True, suppress validation output from the prototype | ||
|
|
||
| Returns: | ||
| dict: Validation result for the file | ||
| """ | ||
| logging.info(f"Validating: {kek_file.name}") | ||
|
|
||
| file_result = { | ||
| "filename": kek_file.name, | ||
| "path": str(kek_file), | ||
| "valid": False, | ||
| "payload_hash_valid": False, | ||
| "error": None, | ||
| "warnings": [], | ||
| "details": {} | ||
| } | ||
|
|
||
| try: | ||
| # First, parse the authenticated variable to check payload hash | ||
| with open(kek_file, 'rb') as f: | ||
| auth_var = EfiVariableAuthentication2(decodefs=f) | ||
| payload = auth_var.payload | ||
| payload_hash = hashlib.sha256(payload).hexdigest() | ||
|
|
||
| file_result["payload_hash"] = payload_hash | ||
| file_result["payload_size"] = len(payload) | ||
| file_result["payload_hash_valid"] = (payload_hash.lower() == EXPECTED_PAYLOAD_HASH.lower()) | ||
|
|
||
| if not file_result["payload_hash_valid"]: | ||
| warning_msg = f"Payload hash mismatch: expected {EXPECTED_PAYLOAD_HASH}, got {payload_hash}" | ||
| file_result["warnings"].append(warning_msg) | ||
| logging.warning(" [!] Payload hash mismatch!") | ||
| logging.warning(f" Expected: {EXPECTED_PAYLOAD_HASH}") | ||
| logging.warning(f" Got: {payload_hash}") | ||
|
|
||
| # Validate the file using auth_var_tool.verify_variable | ||
| # Create a namespace object with the required arguments | ||
| import argparse | ||
| verify_args = argparse.Namespace( | ||
| authvar_file=str(kek_file), | ||
| var_name=KEK_NAME, | ||
| var_guid=KEK_GUID, | ||
| attributes=KEK_ATTRIBUTES, | ||
| verbose=False | ||
| ) | ||
|
|
||
| # Capture logger output if in quiet mode | ||
| if quiet: | ||
| # Temporarily increase logger level to suppress INFO messages | ||
| original_level = logging.root.level | ||
| logging.root.setLevel(logging.ERROR) | ||
|
|
||
| try: | ||
| # verify_variable returns 0 for success, 1 for failure | ||
| exit_code = verify_variable(verify_args) | ||
| file_result["valid"] = (exit_code == 0) | ||
|
|
||
| if not file_result["valid"]: | ||
| file_result["warnings"].append("Signature verification failed") | ||
| finally: | ||
| if quiet: | ||
| # Restore original logger level | ||
| logging.root.setLevel(original_level) | ||
|
|
||
| # Store basic details | ||
| file_result["details"] = { | ||
| "verified": file_result["valid"] | ||
| } | ||
|
|
||
| # Display results | ||
| sig_status = "VALID" if file_result["valid"] else "INVALID" | ||
| payload_status = "True" if file_result["payload_hash_valid"] else "False" | ||
|
|
||
| logging.info(f" Cryptographic Signature: {sig_status}") | ||
| logging.info(f" Expected Payload: {payload_status}\n") | ||
|
|
||
| except Exception as e: | ||
| file_result["error"] = str(e) | ||
| logging.error(f" [X] ERROR: {e}\n") | ||
|
|
||
| return file_result | ||
|
|
||
|
|
||
| def validate_kek_folder( | ||
| folder_path: Path, | ||
| output_file: Path = None, | ||
| quiet: bool = False, | ||
| recursive: bool = False | ||
| ) -> dict: | ||
| """Validate all .bin files in the specified folder. | ||
|
|
||
| Args: | ||
| folder_path: Path to folder containing KEK update files | ||
| output_file: Optional path to output JSON file | ||
| quiet: If True, suppress validation output from the prototype | ||
| recursive: If True, process subdirectories recursively | ||
|
|
||
| Returns: | ||
| dict: Validation results | ||
| """ | ||
| results = { | ||
| "validation_date": datetime.now(timezone.utc).isoformat(), | ||
| "folder": str(folder_path), | ||
| "parameters": { | ||
| "var_name": KEK_NAME, | ||
| "var_guid": KEK_GUID, | ||
| "attributes": KEK_ATTRIBUTES | ||
| }, | ||
| "files": {}, | ||
| "by_manufacturer": {} | ||
| } | ||
|
|
||
| # Find all .bin files (recursively if requested) | ||
| if recursive: | ||
| bin_files = sorted(folder_path.rglob("*.bin")) | ||
| else: | ||
| bin_files = sorted(folder_path.glob("*.bin")) | ||
|
|
||
| if not bin_files: | ||
| logging.warning(f"No .bin files found in {folder_path}") | ||
| # Initialize empty summary for consistency | ||
| results["summary"] = { | ||
| "total": 0, | ||
| "valid": 0, | ||
| "invalid": 0, | ||
| "manufacturers": 0 | ||
| } | ||
| return results | ||
|
|
||
| logging.info(f"Found {len(bin_files)} files to validate\n") | ||
|
|
||
| # Validate each file | ||
| for bin_file in bin_files: | ||
| # Determine manufacturer (relative path from base folder) | ||
| relative_path = bin_file.relative_to(folder_path) | ||
| if len(relative_path.parts) > 1: | ||
| manufacturer = relative_path.parts[0] | ||
| else: | ||
| manufacturer = "root" | ||
|
|
||
| logging.info(f"Validating: {relative_path}") | ||
|
|
||
| file_result = { | ||
| "filename": bin_file.name, | ||
| "relative_path": str(relative_path), | ||
| "manufacturer": manufacturer, | ||
| "path": str(bin_file), | ||
| "valid": False, | ||
| "payload_hash_valid": False, | ||
| "error": None, | ||
| "warnings": [], | ||
| "details": {} | ||
| } | ||
|
|
||
| try: | ||
| # First, parse the authenticated variable to check payload hash | ||
| with open(bin_file, 'rb') as f: | ||
| auth_var = EfiVariableAuthentication2(decodefs=f) | ||
| payload = auth_var.payload | ||
| payload_hash = hashlib.sha256(payload).hexdigest() | ||
|
|
||
| file_result["payload_hash"] = payload_hash | ||
| file_result["payload_size"] = len(payload) | ||
| file_result["payload_hash_valid"] = (payload_hash.lower() == EXPECTED_PAYLOAD_HASH.lower()) | ||
|
|
||
| if not file_result["payload_hash_valid"]: | ||
| warning_msg = f"Payload hash mismatch: expected {EXPECTED_PAYLOAD_HASH}, got {payload_hash}" | ||
| file_result["warnings"].append(warning_msg) | ||
| logging.warning(" [!] Payload hash mismatch!") | ||
| logging.warning(f" Expected: {EXPECTED_PAYLOAD_HASH}") | ||
| logging.warning(f" Got: {payload_hash}") | ||
|
|
||
| # Validate the file using auth_var_tool.verify_variable | ||
| # Create a namespace object with the required arguments | ||
| import argparse | ||
| verify_args = argparse.Namespace( | ||
| authvar_file=str(bin_file), | ||
| var_name=KEK_NAME, | ||
| var_guid=KEK_GUID, | ||
| attributes=KEK_ATTRIBUTES, | ||
| verbose=False | ||
| ) | ||
|
|
||
| # Capture logger output if in quiet mode | ||
| if quiet: | ||
| # Temporarily increase logger level to suppress INFO messages | ||
| original_level = logging.root.level | ||
| logging.root.setLevel(logging.ERROR) | ||
|
|
||
| try: | ||
| # verify_variable returns 0 for success, 1 for failure | ||
| exit_code = verify_variable(verify_args) | ||
| file_result["valid"] = (exit_code == 0) | ||
|
|
||
| if not file_result["valid"]: | ||
| file_result["warnings"].append("Signature verification failed") | ||
| finally: | ||
| if quiet: | ||
| # Restore original logger level | ||
| logging.root.setLevel(original_level) | ||
|
|
||
| # Store basic details | ||
| file_result["details"] = { | ||
| "verified": file_result["valid"] | ||
| } | ||
|
|
||
| # Display results | ||
| sig_status = "VALID" if file_result["valid"] else "INVALID" | ||
| payload_status = "True" if file_result["payload_hash_valid"] else "False" | ||
|
|
||
| logging.info(f" Cryptographic Signature: {sig_status}") | ||
| logging.info(f" Expected Payload: {payload_status}\n") | ||
|
|
||
| except Exception as e: | ||
| file_result["error"] = str(e) | ||
| logging.error(f" [X] ERROR: {e}") | ||
|
|
||
| results["files"][str(relative_path)] = file_result | ||
|
|
||
| # Add to manufacturer grouping | ||
| if manufacturer not in results["by_manufacturer"]: | ||
| results["by_manufacturer"][manufacturer] = { | ||
| "files": [], | ||
| "valid": 0, | ||
| "invalid": 0 | ||
| } | ||
| results["by_manufacturer"][manufacturer]["files"].append(str(relative_path)) | ||
| if file_result["valid"]: | ||
| results["by_manufacturer"][manufacturer]["valid"] += 1 | ||
| else: | ||
| results["by_manufacturer"][manufacturer]["invalid"] += 1 | ||
|
|
||
| # Generate summary | ||
| valid_count = sum(1 for r in results["files"].values() if r["valid"]) | ||
| invalid_count = len(results["files"]) - valid_count | ||
|
|
||
| results["summary"] = { | ||
| "total": len(results["files"]), | ||
| "valid": valid_count, | ||
| "invalid": invalid_count, | ||
| "manufacturers": len(results["by_manufacturer"]) | ||
| } | ||
|
|
||
| logging.info(f"\n{'='*60}") | ||
| logging.info("SUMMARY:") | ||
| logging.info(f" Total Files: {results['summary']['total']}") | ||
| logging.info(f" Valid: {results['summary']['valid']}") | ||
| logging.info(f" Invalid: {results['summary']['invalid']}") | ||
| if recursive: | ||
| logging.info(f" Manufacturers: {results['summary']['manufacturers']}") | ||
| logging.info("") | ||
| logging.info("By Manufacturer:") | ||
| for mfr, data in sorted(results["by_manufacturer"].items()): | ||
| logging.info( | ||
| f" {mfr:30s} Total: {len(data['files']):3d} Valid: {data['valid']:3d} Invalid: {data['invalid']:3d}" | ||
| ) | ||
| logging.info(f"{'='*60}") | ||
|
|
||
| # Save to file if requested | ||
| if output_file: | ||
| with open(output_file, 'w') as f: | ||
| json.dump(results, f, indent=2) | ||
| logging.info(f"\nResults saved to: {output_file}") | ||
|
|
||
| return results | ||
|
|
||
|
|
||
| def main() -> int: | ||
| """Main entry point for validating KEK update file(s).""" | ||
| parser = argparse.ArgumentParser( | ||
| description="Validate KEK update file(s) - single file or folder" | ||
| ) | ||
| parser.add_argument( | ||
| "path", | ||
| type=Path, | ||
| help="Path to a KEK update file (.bin) or folder containing KEK update files" | ||
| ) | ||
| parser.add_argument( | ||
| "-o", "--output", | ||
| type=Path, | ||
| default=None, | ||
| help="Path to output JSON file (default: <path>_validation_results.json, always generated)" | ||
| ) | ||
| parser.add_argument( | ||
| "-r", "--recursive", | ||
| action="store_true", | ||
| help="Process subdirectories recursively (only applicable for folders)" | ||
| ) | ||
| parser.add_argument( | ||
| "-v", "--verbose", | ||
| action="store_true", | ||
| help="Enable verbose logging" | ||
| ) | ||
| parser.add_argument( | ||
| "-q", "--quiet", | ||
| action="store_true", | ||
| help="Suppress validation output (show only summary)" | ||
| ) | ||
|
|
||
| args = parser.parse_args() | ||
|
|
||
| # Setup logging | ||
| log_level = logging.DEBUG if args.verbose else logging.INFO | ||
| logging.basicConfig( | ||
| level=log_level, | ||
| format='%(message)s' | ||
| ) | ||
|
|
||
| # Validate path exists | ||
| if not args.path.exists(): | ||
| logging.error(f"Path not found: {args.path}") | ||
| return 1 | ||
|
|
||
| # Determine if path is a file or directory | ||
| if args.path.is_file(): | ||
| # Validate single file | ||
| if not args.path.suffix == '.bin': | ||
| logging.error(f"File must have .bin extension: {args.path}") | ||
| return 1 | ||
|
|
||
| # Determine output file | ||
| if args.output is None: | ||
| output_file = args.path.parent / f"{args.path.stem}_validation_results.json" | ||
| else: | ||
| output_file = args.output | ||
|
|
||
| # Validate the single file | ||
| file_result = validate_single_kek(args.path, quiet=args.quiet) | ||
|
|
||
| # Create results structure | ||
| results = { | ||
| "validation_date": datetime.now(timezone.utc).isoformat(), | ||
| "file": str(args.path), | ||
| "parameters": { | ||
| "var_name": KEK_NAME, | ||
| "var_guid": KEK_GUID, | ||
| "attributes": KEK_ATTRIBUTES | ||
| }, | ||
| "result": file_result | ||
| } | ||
|
|
||
| # Save to file | ||
| with open(output_file, 'w') as f: | ||
| json.dump(results, f, indent=2) | ||
| logging.info(f"Results saved to: {output_file}") | ||
|
|
||
| # Return exit code based on validation | ||
| return 0 if file_result["valid"] else 1 | ||
|
|
||
| elif args.path.is_dir(): | ||
| # Validate folder | ||
| # Determine output file | ||
| if args.output is None: | ||
| output_file = args.path.parent / f"{args.path.name}_validation_results.json" | ||
| else: | ||
| output_file = args.output | ||
|
|
||
| # Run validation | ||
| results = validate_kek_folder(args.path, output_file, quiet=args.quiet, recursive=args.recursive) | ||
|
|
||
| # Return exit code based on results | ||
| if results["summary"]["invalid"] > 0: | ||
| return 1 | ||
|
|
||
| return 0 | ||
|
|
||
| else: | ||
| logging.error(f"Invalid path type: {args.path}") | ||
| return 1 | ||
|
|
||
|
|
||
| if __name__ == "__main__": | ||
| sys.exit(main()) |
The repository has comprehensive test coverage for other validation scripts (e.g., test_validate_dbx_references.py), but there are no tests for validate_kek.py. Consider adding unit tests to cover the validation logic, error handling, folder scanning, and JSON output generation. Tests should verify behavior with valid files, invalid signatures, payload hash mismatches, and malformed input files.
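A first test along these lines could pin down the payload-hash path without needing fixture files. The helper below is a hypothetical extraction of the hash-comparison logic from `validate_single_kek` (it is not part of the PR; the function name and shape are illustrative), shown as a runnable sketch:

```python
import hashlib

# Hash from the PR's validate_kek.py (Microsoft 2023 KEK EFI Signature List)
EXPECTED_PAYLOAD_HASH = "5b85333c009d7ea55cbb6f11a5c2ff45ee1091a968504c929aed25c84674962f"

def check_payload(payload: bytes, expected_hash: str = EXPECTED_PAYLOAD_HASH) -> dict:
    """Mirror of the hash-mismatch logic in validate_single_kek, pulled out
    so it can be unit-tested without constructing a full authenticated
    variable file."""
    payload_hash = hashlib.sha256(payload).hexdigest()
    result = {
        "payload_hash": payload_hash,
        "payload_hash_valid": payload_hash.lower() == expected_hash.lower(),
        "warnings": [],
    }
    if not result["payload_hash_valid"]:
        result["warnings"].append(
            f"Payload hash mismatch: expected {expected_hash}, got {payload_hash}"
        )
    return result

# A payload that does not match the expected KEK hash produces a warning,
# mirroring the "warn but don't fail" behavior in the workflow
r = check_payload(b"not a real EFI signature list")
assert r["payload_hash_valid"] is False
assert r["warnings"]
```

Factoring the comparison out of the file-parsing code is one way to make the mismatch, match, and case-insensitivity branches testable without `EfiVariableAuthentication2` fixtures.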
```yaml
jobs:
  validate-kek:
    name: Validate KEK Update Files
    runs-on: ubuntu-latest
```

The workflow uses the github-script action to comment on pull requests (lines 122-132) but doesn't explicitly set permissions. The default GITHUB_TOKEN permissions for pull_request events may not include `pull-requests: write`, which is required for creating comments. Consider adding explicit permissions at the job level so the workflow can comment on PRs when validation fails.

Suggested change:
```diff
     runs-on: ubuntu-latest
+    permissions:
+      pull-requests: write
```
scripts/auth_var_tool.py (excerpt — new verify command):

```python
def verify_variable(args: argparse.Namespace) -> int:
    """Verifies the cryptographic signature of an authenticated variable.

    This command validates that:
    1. The PKCS7 signature structure is valid
    2. The signature cryptographically verifies against the signable data
    3. The signing certificate is present in the signature

    Parameters
    ----------
    args : argparse.Namespace
        Command-line arguments including:
        - authvar_file: Path to the signed authenticated variable file
        - var_name: Variable name used during signing
        - var_guid: Variable GUID used during signing
        - attributes: Variable attributes used during signing
        - verbose: Enable detailed output

    Returns
    -------
    int
        0 if verification succeeds, 1 if verification fails
    """
    try:
        # Parse the authenticated variable
        logger.info(f"Verifying authenticated variable: {args.authvar_file}")

        with open(args.authvar_file, 'rb') as f:
            auth_var = EfiVariableAuthentication2(decodefs=f)

        # Reconstruct the signable data using the builder
        signing_time = auth_var.time.get_datetime()
        builder = EfiVariableAuthentication2Builder(
            name=args.var_name,
            guid=uuid.UUID(args.var_guid),
            attributes=args.attributes,
            payload=auth_var.payload,
            efi_time=signing_time
        )
        signable_data = builder.get_digest()

        if args.verbose:
            logger.info(f"Variable Name: {args.var_name}")
            logger.info(f"Variable GUID: {args.var_guid}")
            logger.info(f"Attributes: {args.attributes}")
            logger.info(f"Signing Time: {signing_time}")
            logger.info(f"Payload Size: {len(auth_var.payload)} bytes")
            logger.info(f"Signable Data SHA256: {hashlib.sha256(signable_data).hexdigest()}")

        # Extract PKCS7 signature (cert_data from edk2toollib is the PKCS7 data)
        pkcs7_data = auth_var.auth_info.cert_data

        # Extract certificates
        certificates = _extract_certificates_from_pkcs7(pkcs7_data)

        if args.verbose:
            logger.info(f"\nCertificates found: {len(certificates)}")
            for i, cert in enumerate(certificates, 1):
                logger.info(f"  Certificate {i}:")
                logger.info(f"    Subject: {cert.subject.rfc4514_string()}")
                logger.info(f"    Issuer: {cert.issuer.rfc4514_string()}")
                logger.info(f"    Valid: {cert.not_valid_before_utc} to {cert.not_valid_after_utc}")

        # Verify the signature
        verification_result = _verify_pkcs7_signature(pkcs7_data, certificates, signable_data)

        # Display results
        if args.verbose:
            logger.info("")
            logger.info("Signature Verification Results:")
            for signer in verification_result['signers']:
                logger.info(f"  Signer {signer['index'] + 1}:")
                if signer['verified']:
                    logger.info("    Status: VERIFIED")
                else:
                    logger.info("    Status: FAILED")
                    if signer['error']:
                        logger.info(f"    Error: {signer['error']}")

        if verification_result['verified']:
            logger.info("[+] Authenticated variable signature is VALID")
            return 0
        else:
            logger.error("[-] Authenticated variable signature verification FAILED")
            for error in verification_result['errors']:
                logger.error(f"  - {error}")
            return 1

    except Exception as e:
        logger.error(f"Failed to verify authenticated variable: {e}")
        if args.verbose:
            import traceback
            traceback.print_exc()
        return 1
```
The repository has comprehensive test coverage for other scripts (e.g., test_authenticode_transplant.py, test_validate_dbx_references.py), but there are no tests for the new verify_variable function or the related PKCS7 verification logic. Consider adding unit tests to validate the cryptographic verification functionality, especially for edge cases like invalid signatures, missing certificates, unsupported algorithms, and malformed PKCS7 data.
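For the malformed-PKCS7 edge case specifically, a small guard like the following could anchor one of those tests. This is a sketch only: it assumes the `cryptography` package (which the tool's certificate handling already appears to use), and the wrapper name `safe_extract_certs` is hypothetical, not from the PR:

```python
from cryptography.hazmat.primitives.serialization import pkcs7

def safe_extract_certs(pkcs7_data: bytes):
    """Return the certificate list from DER-encoded PKCS7 data,
    or None when the blob cannot be parsed as PKCS7 at all."""
    try:
        return pkcs7.load_der_pkcs7_certificates(pkcs7_data)
    except ValueError:
        # cryptography raises ValueError on unparseable PKCS7 input
        return None

# Garbage bytes should be rejected cleanly rather than crash the tool
assert safe_extract_certs(b"\x00\x01not-pkcs7") is None
```

A test suite built on this shape can then cover truncated auth headers, signatures with no embedded certificates, and unsupported digest algorithms as separate cases.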
```python
subparsers = setup_format_parser(subparsers)
subparsers = setup_sign_parser(subparsers)
subparsers = setup_describe_parser(subparsers)
subparsers = setup_verify_parser(subparsers)
```

The reassigned `subparsers` variable is never used.

Suggested change:
```diff
-subparsers = setup_format_parser(subparsers)
-subparsers = setup_sign_parser(subparsers)
-subparsers = setup_describe_parser(subparsers)
-subparsers = setup_verify_parser(subparsers)
+setup_format_parser(subparsers)
+setup_sign_parser(subparsers)
+setup_describe_parser(subparsers)
+setup_verify_parser(subparsers)
```
```python
except (UnicodeDecodeError, AttributeError):
    pass
```

The `except` clause does nothing but `pass`, and there is no explanatory comment.

Suggested change:
```diff
-except (UnicodeDecodeError, AttributeError):
-    pass
+except (UnicodeDecodeError, AttributeError) as e:
+    # Failed to decode ASN.1 string content as UTF-8; fall back to original hex representation.
+    logging.debug(f"Failed to decode ASN.1 string content from '{hex_string}': {e}")
```
```python
except UnicodeDecodeError:
    pass
```

The `except` clause does nothing but `pass`, and there is no explanatory comment.

Suggested change:
```diff
-except UnicodeDecodeError:
-    pass
+except UnicodeDecodeError as e:
+    # Failed to decode as direct UTF-8; this is expected for non-textual values.
+    logging.debug(f"Failed to directly decode hex string '{hex_string}' as UTF-8: {e}")
```
@copilot open a pull request from this branch and implement the changes |
This pull request introduces a new cryptographic signature verification workflow for KEK update files and adds significant enhancements to the `auth_var_tool.py` script, including a new `verify` command for authenticated variables and improved human-readable output for variable descriptions.

Summary of most important changes:

CI/CD and Workflow Automation
- Added a new GitHub Actions workflow (`.github/workflows/validate-kek-updates.yml`) to automatically validate KEK update files in pull requests. This workflow checks for valid cryptographic signatures and expected payloads, uploads validation results, and comments on PRs if validation fails.

Authenticated Variable Tool Enhancements

New Verification Feature
- Added a new `verify` command to `auth_var_tool.py` that allows users to verify the cryptographic signature of an authenticated variable. This includes PKCS7 signature structure validation, signature verification against signable data, and certificate checks. The command is fully integrated into the CLI and documented in the script's help and usage examples. [1] [2] [3] [4] [5]

Improved Output for Describe Command
- Improved the `describe` command output by converting hex-encoded ASN.1 strings (such as certificate fields) into human-readable text where possible, making it easier to interpret variable contents. [1] [2]

These changes improve both the automation and usability of KEK update validation and authenticated variable management in the codebase.

## Description
<Include a description of the change and why this change was made.>
For details on how to complete these options and their meaning refer to CONTRIBUTING.md.
## How This Was Tested
<Describe the test(s) that were run to verify the changes.>
## Integration Instructions
<Describe how these changes should be integrated. Use N/A if nothing is required.>