Integrating Broadcom Test Data Manager Logs with Splunk: A Step-by-Step Guide
Properly managed test data is crucial for accurate and reliable software testing, as it allows development teams to simulate high-fidelity real-world scenarios, identify critical issues earlier, and deliver robust software that more readily meets user requirements. However, test data management isn’t always straightforward. It often requires considerable time and resources to generate and maintain datasets that are representative, diverse, up-to-date, and secure.
Broadcom Test Data Manager (TDM) helps today’s agile teams overcome these challenges by allowing them to more easily manage their test data. When TDM logs are ingested into Splunk, teams gain real-time search, monitoring, and analysis for automated test data management at scale, along with broader data observability through Splunk’s powerful analytics and visualizations.
In this guide, you’ll learn how to load Broadcom TDM logs into the Splunk platform. By creating a Python script to process and ingest TDM logs for Splunk, you can make your test data logs easier to search, analyze for errors, and visualize for real-time monitoring and insights.
What Is Broadcom TDM?
Broadcom TDM is a solution that helps development teams and partners create and manage synthetic data for various testing purposes in their software development lifecycle (SDLC) processes. For example, teams can use TDM to create fit-for-purpose datasets for security and compliance validation, fill gaps in test data coverage, and create future scenarios and unexpected results to test boundary conditions, to name a few use cases.
Here are Broadcom TDM’s key features:
Synthetic Data Creation
TDM enables software teams to create synthetic test data from scratch using sophisticated coverage analysis, allowing for the smallest datasets required for comprehensive testing. With synthetic data, developers can test future scenarios and unexpected results—particularly values at the edge of input ranges, where errors are most likely to occur.
This approach helps ensure that the software behaves as expected within the limits of allowable input, reducing the risk of costly delays and production errors.
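TDM derives these boundary cases automatically. Purely as an illustration of the underlying idea (this is not TDM’s API), the short sketch below shows how boundary values for a hypothetical numeric input range might be enumerated:

# Minimal illustration of boundary-value selection (not TDM's API).
# Values at and just beyond the edges of an allowed range are the ones
# most likely to expose errors.

def boundary_values(minimum: int, maximum: int) -> list[int]:
    """Return candidate test values at the edges of an allowed range."""
    return [minimum - 1, minimum, minimum + 1, maximum - 1, maximum, maximum + 1]

# Example: an "age" field that must accept values from 18 to 120
print(boundary_values(18, 120))  # [17, 18, 19, 119, 120, 121]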
Data Discovery and Profiling
TDM’s Discovery and Profiling feature provides data sampling and discovery capabilities that enable teams to profile data within a software project. These functionalities are crucial for identifying personally identifiable information (PII) across multiple data sources, ensuring that data privacy and compliance issues are surfaced and addressed as early as possible.
Data Masking and Virtual Test Data
Teams can use TDM to quickly provision virtual copies of test data for each tester on demand. This enables development and testing teams to deliver higher-quality software to market faster, at a lower cost, and with fewer bugs and deployment issues.
TDM also provides a scalable PII masking feature that quickly and efficiently masks large data volumes, allowing teams to demonstrate continuous compliance with data privacy regulations without additional administrative or configuration overhead.
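TDM performs masking at scale and across data sources. As a simple conceptual sketch only (not TDM’s masking engine), consistent PII masking can be thought of as a deterministic, irreversible transformation of each sensitive value:

# Conceptual sketch of deterministic PII masking (not TDM's masking engine).
import hashlib

def mask_email(email: str) -> str:
    """Replace an email address with a consistent, irreversible token."""
    digest = hashlib.sha256(email.lower().encode("utf-8")).hexdigest()[:12]
    return f"user_{digest}@example.com"

# The same input always yields the same masked value, so referential
# integrity across masked datasets is preserved.
print(mask_email("jane.doe@company.com"))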
Data Storage and Reuse
TDM allows software teams to store test data as centralized, reusable assets that can be cloned and provisioned in minutes to match specific tests and use cases. Data models and rules are also stored in a central repository, minimizing redundant efforts and maximizing the value of existing work.
Automated Test Data Management and Agile Software Development with Broadcom TDM
Comprehensive testing is essential to high-quality software, as it thoroughly evaluates all aspects of the application to uncover potential bugs, performance bottlenecks, and usability issues before they reach end users. Automated test data management is a critical component of comprehensive testing, enabling teams to scale their efforts efficiently and cost-effectively while achieving better results.
Agile software development relies on iterative, customer-centric, adaptive tactics to deliver high-quality software quickly and at a lower cost. TDM enhances data management in agile processes by reducing data volumes, test durations and costs, and overall development lifecycle time.
Test data preparation efforts that extend beyond the typical two-week sprint cycle can disrupt a development team’s cadence—TDM helps avoid such delays by providing test data on demand, in seconds, for end-to-end application development and validation.
Why Integrate TDM with Splunk?
Integrating Broadcom TDM logs with Splunk offers significant advantages in test data management. Splunk enhances data visibility by providing powerful analytics and dashboards, giving your software development team deeper insights into your test data logs. This integration supports more efficient error monitoring and troubleshooting processes, improved compliance tracking with real-time insights, and timely access to accurate test data, helping your team align with the agile development schedule and cadence.
Broadcom TDM-Splunk Integration: Prerequisites
The following items are required for integrating your Broadcom TDM logs with Splunk:
- Splunk Enterprise
- Broadcom TDM log files
- Python IDE or code editor (e.g., PyCharm or Visual Studio Code)
- Some proficiency with Python
Please note that this guide specifically covers integrating Broadcom TDM log files with Splunk Enterprise (not Splunk Cloud) on a Windows-based system.
Loading TDM into Splunk: A Step-by-Step Guide
The following steps will show you how to set up Splunk to ingest TDM log files using a custom Python script.
Setting Up Splunk
To configure your instance of Splunk Enterprise to ingest TDM log files, first create a new app. Navigate to the Apps section of Splunk Enterprise and click on the Create app button on the top-right side.

On the Add new page, provide a name for your new app (e.g., “Insert TDM Logs”).
Under Folder name, specify the directory where your app will be located (e.g., “Insert_TDM_Logs”). Take note of this folder name: the app is created under Splunk’s apps directory (typically C:\Program Files\Splunk\etc\apps\Insert_TDM_Logs on a default Windows installation), and its bin subdirectory is where you will place your custom Python script. The remaining fields are optional and can be left blank for now.

Click the Save button to add your new app. Once the app has been saved, you will be redirected to the Apps page, where you can verify that your new app was created successfully.

Writing the Python Script to Process TDM Logs
The following Python script processes your TDM log files by loading each log file from the folder specified in the dir variable (e.g., “C:\\data\\Tdm_logs\\logs\\”). The script then matches lines containing the “[ERROR]” tag and prints each match to standard output as a JSON object, which Splunk captures and indexes when it runs the script as a scripted input.
import re
import os
import json

def filter_errors(log_line, filename):
    # Match lines tagged with [ERROR]
    pattern = r"\[ERROR\]"
    debug_string_holder = {}
    match = re.search(pattern, log_line)
    if match:
        # Split the line on whitespace: date, time, [ERROR], component, message...
        pattern = r"\s+"
        string_splits = re.split(pattern, log_line)
        debug_string_holder["Data"] = string_splits[0] + " " + string_splits[1]  # date and time
        debug_string_holder["Type"] = string_splits[3]                           # component that logged the error
        debug_string_holder["Message"] = " ".join(string_splits[4:])             # remaining message text
    if len(debug_string_holder) != 0:
        # Emit each error as a JSON object on standard output for Splunk to ingest
        json_object = json.dumps(debug_string_holder)
        print(json_object)

dir = "C:\\data\\Tdm_logs\\logs\\"
for filename in os.listdir(dir):
    if filename.endswith(".log"):
        with open(dir + filename, 'r') as f:
            for line in f:
                filter_errors(line, filename)
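As an illustration of the parsing logic, assume a hypothetical TDM log line in the layout the script expects (date, time, [ERROR] tag, component, message); your actual log layout may differ, in which case adjust the field indexes accordingly:

import re
import json

# Hypothetical log line, shown only to illustrate the parsing; real TDM log
# layouts may differ and the field indexes should be adjusted to match.
sample = "2024-10-30 12:00:01 [ERROR] PublishService Connection to target database refused"

fields = re.split(r"\s+", sample)
record = {
    "Data": fields[0] + " " + fields[1],   # date and time
    "Type": fields[3],                     # component following the [ERROR] tag
    "Message": " ".join(fields[4:]),       # remaining message text
}
print(json.dumps(record))
# {"Data": "2024-10-30 12:00:01", "Type": "PublishService", "Message": "Connection to target database refused"}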
To adapt the script to your system, copy the code into your IDE or code editor and replace the value of the dir variable with the path to your TDM log files. On Windows, remember to escape each backslash with a second backslash, as shown in the example path.
Save the script as “import_TDM_Debug_logs.py” in the bin subdirectory of your app’s directory (i.e., the Folder name you specified earlier when creating the app). If you use a different file name, be sure to keep the “.py” extension so Splunk recognizes it as a Python script.
Automating Log Ingestion into Splunk
Once you have created your Python script, go back to the Apps page in Splunk Enterprise. Under Actions, click the Launch app link for the “Insert TDM Logs” app you created earlier.
In the top navigation, open Settings and click Data inputs under the “DATA” heading.
On the Data inputs page, click the Scripts link to add your Python script to your app.

You can configure your Python script’s settings on the Add Data page. Under Script Path, navigate to the bin directory where your Python script is saved.
For Script Name and Command, you can accept the default values provided by Splunk. You may also provide custom values for Interval Input and Interval to set the timing parameters that Splunk will use when executing your script.
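If you prefer to manage this configuration as a file rather than through the UI, Splunk stores scripted input settings in the app’s inputs.conf. A rough sketch of an equivalent stanza in the app’s local\inputs.conf might look like the following (the app-relative script path and the 300-second interval are examples, not values prescribed by this guide):

[script://./bin/import_TDM_Debug_logs.py]
disabled = 0
interval = 300
sourcetype = json_no_timestamp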

Click Next to continue to the Input Settings page. For Source Type, select the “Structured” category and then “json_no_timestamp.” For App Context, make sure that “Insert TDM Logs (Insert_TDM_Logs)” is selected. You can accept the default values for the other fields and click Next.

Click Review, verify your settings, and then click Submit to create the script input. Click the Start Searching button to view the TDM error logs ingested into Splunk.

You can now view your TDM error logs in Splunk, filter them using the Selected Fields and Interesting Fields lists, and explore the Statistics and Visualization tabs beneath the search bar.
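For example, once the events are indexed, a simple search along these lines (the field name comes from the JSON keys the script emits, and the source type from what you selected above) would count ingested errors by component:

sourcetype=json_no_timestamp | stats count by Type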
Enhancing Test Data Management with Broadcom TDM and Splunk
By following this guide, you can equip your development team with streamlined testing processes using Broadcom TDM logs and Splunk Enterprise, enabling real-time search, monitoring, and analysis for automated test data management at scale, along with data observability through analytics and visualizations.
For more information about integrating Test Data Manager by Broadcom or additional support, contact A&I Solutions with your questions today.