Supply Chain Headache: TeamPCP Backdoors Checkmarx Jenkins AST Plugin
TeamPCP is really targeting the DevSecOps tooling space right now. Following the KICS supply chain hit, they've now compromised the Checkmarx Jenkins AST plugin.
Checkmarx confirmed that a modified version was pushed to the Jenkins plugin update center. The guidance is immediate: verify your version. You must be on 2.0.13-829.vc72453fa_1c16 (published Dec 17, 2025) or earlier. Anything newer should be treated as compromised until further notice.
Here is a quick Groovy script for the Jenkins Script Console to audit your instances immediately:
Jenkins.instance.pluginManager.plugins.each { plugin ->
    if (plugin.shortName == "checkmarx-ast-plugin") {
        println "Found: ${plugin.displayName}"
        println "Version: ${plugin.version}"
        println "Active: ${plugin.isEnabled()}"
        // Per the advisory, only 2.0.13-829.vc72453fa_1c16 or earlier is known-good;
        // anything that differs needs manual review, and anything newer is compromised.
        if (plugin.version != "2.0.13-829.vc72453fa_1c16") {
            println "WARNING: Version differs from the last known-good release -- treat as potentially compromised"
        }
    }
}
Since TeamPCP is linked to PCPJack and cloud credential theft, I'd recommend cross-referencing your Jenkins logs with CloudTrail or Azure Monitor for any anomalous role assumption or secret access attempts during the window this plugin was active.
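On the AWS side, a minimal starting point with the standard CLI, using the Dec 17 publish date as the earliest possible start of the exposure window (narrow it once you know your actual install date):
aws cloudtrail lookup-events \
  --lookup-attributes AttributeKey=EventName,AttributeValue=AssumeRole \
  --start-time 2025-12-17T00:00:00Z \
  --end-time "$(date -u +%Y-%m-%dT%H:%M:%SZ)" \
  --max-results 50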
This raises a tough question about supply chain trust. How are you guys handling internal plugin repositories? Are you air-gapping your update centers or just pinning specific hashes?
We pin versions explicitly in our Dockerfiles using the jenkins-plugin-cli. It adds overhead to the update cycle, but it prevents exactly this kind of 'latest' tag poisoning. We also proxy the Jenkins update center through Artifactory so we can approve specific versions before they hit our prod build nodes.
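Roughly what that looks like, as a sketch; the base image tag here is illustrative, pin your own LTS image in practice:
FROM jenkins/jenkins:lts
# Resolve the exact known-good version at image build time instead of 'latest'
RUN jenkins-plugin-cli --plugins checkmarx-ast-plugin:2.0.13-829.vc72453fa_1c16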
Good catch on the Groovy script. For those who can't run scripts on the master node, you can also check the filesystem directly for the plugin manifest:
grep -E 'Plugin-Version|Short-Name' /var/jenkins_home/plugins/checkmarx-ast-plugin/META-INF/MANIFEST.MF
Given the PCPJack history, I'd also look for unexpected outbound Java processes from your Jenkins user.
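A quick way to eyeball that, assuming lsof is installed and the service account is named jenkins:
# List established network connections opened by processes running as the jenkins user
sudo lsof -u jenkins -i -nP | grep ESTABLISHED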
It's ironic that a tool designed to find security flaws became the flaw. We saw similar behavior with the KICS incident. The attacker is clearly looking for high-privilege CI/CD environments to harvest cloud creds. If you find this plugin, assume your AWS/GCP keys are burned and rotate them immediately.
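For the AWS case, rotation is three calls; the user name and key ID below are placeholders for your CI principal:
# Enumerate, deactivate, then replace the burned key
aws iam list-access-keys --user-name jenkins-ci
aws iam update-access-key --user-name jenkins-ci --access-key-id AKIAEXAMPLEKEYID --status Inactive
aws iam create-access-key --user-name jenkins-ci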
Validating the version is step one, but for compliance reporting, you'll need the timeline of exposure. Cross-reference your SBOM with the Jenkins logs to pinpoint the installation date of the malicious artifact. This helps define the scope of the breach. You can find installation events here:
zgrep "Checkmarx AST" /var/log/jenkins/jenkins.log* | grep "installed"
Don't forget to revoke any credentials managed within the Jenkins instance immediately if the bad version was active.
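To scope what needs rotating, the credentials plugin exposes a CLI listing; a sketch assuming the standard jenkins-cli.jar, the system-level store, and a placeholder admin API token:
java -jar jenkins-cli.jar -s http://localhost:8080/ -auth admin:APITOKEN list-credentials system::system::jenkins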
Once you identify the compromised plugin, don't just delete it. Check for persistence artifacts often left by TeamPCP, such as webshells dropped in the webapps directory. This command finds recently modified JSP files, which are common payloads in these CI/CD takeovers:
find /var/lib/jenkins/webapps -name "*.jsp" -mtime -30 -exec file {} \;
Has anyone observed outbound network traffic on port 4444 from their Jenkins masters?
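If you want to check for that specifically, ss can filter on the remote port (root is needed for the -p process column):
sudo ss -tnp state established '( dport = :4444 )'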
Building on the verification steps, calculating the SHA-256 hash of the plugin archive is the most reliable way to confirm the file hasn't been tampered with. If you have a known-good hash from the vendor, compare it against what's in your plugins directory. This prevents false positives if the attackers preserved the version string.
sha256sum /var/lib/jenkins/plugins/checkmarx-ast-plugin.jpi
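If you have the vendor hash in hand, sha256sum can do the comparison for you; <known-good-sha256> below is a placeholder for the published value:
echo "<known-good-sha256>  /var/lib/jenkins/plugins/checkmarx-ast-plugin.jpi" | sha256sum -c -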