...
Multi-language Kit is available for localization.
Direct execution of all features provided by Security Reviewer Suite (SAST, DAST, SCA, Mobile, Firmware)
Extended Workflow and Reporting features, GDPR Compliance Level included
Performant database, based on a MariaDB 10.x Galera cluster. It can be changed to Oracle RAC 12 or any other supported relational database
Secured source code and operation platform, thanks to a thorough Static Code Review and Dynamic Analysis performed with the Security Reviewer and Dynamic Reviewer tools
Encryption of DB Tables containing sensitive data (Users, Groups, Applications, Workflow, Policies, etc.)
Enhanced support for third-party SAST, IAST, DAST and Network Scan tools.
Mobile Behavioral Analysis integration (Mobile Reviewer)
Software Composition Analysis (SCA) integration
Software Resilience Analysis (SRA) integration
Firmware Reviewer Single Sign On
SQALE, OWASP Top Ten 2017, Mobile Top Ten 2016, CWE, CVE, WASC, CVSSv2, CVSSv3.1 and PCI-DSS 3.2.1 Compliance
Application Portfolio Management tools integration
Models
Team Reviewer attempts to simplify how users interact with the system by minimizing the number of objects it defines. The definition for each, as well as sample usages, is below.
...
Products
Any application, project, program, or product that you are currently analyzing.
Engagement
Engagements are moments in time when testing is taking place, also known as an Audit. They are associated with a name for easy reference, a timeline, a lead (the user account of the main person conducting the testing), a test strategy, and a status.
There are two types of Engagement: Interactive and CI/CD.
An interactive engagement is typically an engagement conducted by an engineer, where findings are usually uploaded by the engineer.
A CI/CD engagement, as its name suggests, is for automated integration with a CI/CD pipeline.
...
Each Engagement can include several Tests.
...
You can view the Test Strategy or Threat Model, modify the Engagement dates, view Tests and Findings, add Risk Acceptance, complete the security Check List, or close the Engagement.
...
Engagements are linked to a timeline in the Calendar.
...
Engagement Survey
Engagement Survey extends Engagement records by incorporating survey(s) associated with each engagement to help develop a test strategy.
The default questions within these surveys have been created by the Rackspace Security Engineering team to help identify the attack vectors and risks associated with the product being assessed.
Customizable GDPR, Static Analysis and Dynamic Analysis Surveys are also available.
Findings
A finding represents a flaw discovered while testing. It can be categorized with severities of Critical, High, Medium, Low, and Informational (Info).
...
Each Finding gets a unique ID and a Status.
Findings are the defects or interesting things that you want to keep track of when testing a Product during a Test/Engagement. Here, you can lay out the details of what went wrong, where you found it, what the impact is, and your proposed steps for mitigation.
If authorized, you can force the Status, Severity and Risk Level. This operation is tracked in a special log that authorized users can view.
You can Filter by: ID, Application, Severity, Finding Name, Date range, SLA, Auditor (Reporter, Found By), Status, Risk Level, N. of Vulnerabilities.
You can also reference CWEs, or add links to your own references (External Documentation Links included).
...
Templating findings allows you to create a version of a finding descriptor that you can then re-use over and over again, on any Engagement.
False Positive and Duplicates
Templates can be used across all Engagements. Define what kind of Finding this is. Is it a false positive? A duplicate? If you want to save this finding as a template, check the “Is template” box.
...
Findings cannot always be remediated or addressed for various reasons.
A Finding's status can be changed to Accepted as follows. Findings are accepted in the Engagement view.
...
You fill in the details to support the risk acceptance.
...
De-duplication is a feature that, when enabled, compares findings to automatically identify duplicates.
To enable de-duplication go to System Settings and check Deduplicate findings.
Team Reviewer deduplicates findings by comparing endpoints, CWE fields, and titles. If two findings share a URL and have the same CWE or title, Team Reviewer marks the less recent finding as a duplicate. When de-duplication is enabled, a list of deduplicated findings is added to the engagement view.
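The matching rule above can be sketched as follows. This is one reading of the documented behavior, not Team Reviewer's actual implementation; the field names are illustrative:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    title: str
    cwe: int
    url: str       # the shared endpoint
    created: int   # e.g. a timestamp; higher = more recent

def is_duplicate(a: Finding, b: Finding) -> bool:
    """True when two findings share a URL and agree on CWE or title."""
    return a.url == b.url and (a.cwe == b.cwe or a.title == b.title)

def mark_duplicates(findings: list) -> list:
    """Return the findings that would be flagged as duplicates
    (the less recent one of each matching pair)."""
    dupes = []
    for i, a in enumerate(findings):
        for b in findings[i + 1:]:
            if is_duplicate(a, b):
                dupes.append(a if a.created < b.created else b)
    return dupes
```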
The following image illustrates the deduplication options at the engagement and the product level:
...
Visual representation:
...
While viewing a finding, similar findings within the same product are listed along with buttons to mark one finding a duplicate of the other.
Clicking the “Use as original” button on a similar finding will mark that finding as the original while marking the viewed finding as a duplicate.
Clicking the “Mark as duplicate” button on a similar finding will mark that finding as a duplicate of the viewed finding.
If a similar finding is already marked as a duplicate, then a “Reset duplicate status” button is shown instead which will remove the duplicate status on that finding along with marking it active again.
...
Product Types
Product Types represent the top-level model. These can be business unit divisions, different offices or locations, development teams, or any other logical way of distinguishing “types” of products.
...
Environments
These describe the environment that was tested in a particular Test.
Examples
Production
Staging
Stable
Development
...
Test Types
These can be any sort of distinguishing characteristic about the type of testing that was done in an Engagement.
Examples
Functional
Security
Nessus Scan
API test
Static Analysis
...
Metrics
Tracking metrics for your Products can help you identify Products that may need additional help, or highlight a particularly effective member of your team.
You can also see the Dashboard view, a page that scrolls automatically, showing off the results of your testing.
This can be useful if you want to display your team’s work in public without showing specific details.
...
Benchmarks
Team Reviewer uses the OWASP ASVS Benchmarks to verify that a product meets your application's technical security controls.
Benchmarks can be defined per the organization's secure development policy, and multiple benchmarks can be applied to a product.
...
Benchmarks are available from the Product view.
In the Benchmarks view for each product, the default level is ASVS Level 1, but it can be changed to the desired ASVS level (Level 1, Level 2 or Level 3).
Further, the ASVS score is displayed on the product page and applied to reporting.
...
On the left-hand side the ASVS score is displayed, together with the desired score, the percentage of benchmarks passed to achieve the score, and the total enabled benchmarks for that ASVS level.
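The displayed percentage is the share of enabled benchmarks that passed at the chosen level. A minimal sketch (field names are illustrative, not Team Reviewer's data model):

```python
def asvs_score(passed: int, enabled: int) -> float:
    """Percentage of enabled ASVS benchmarks that passed,
    rounded to one decimal; 0.0 when none are enabled."""
    return round(100.0 * passed / enabled, 1) if enabled else 0.0
```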
Static Server Plugin
Static Server Plugin for Team Reviewer (to be purchased separately) is able to run Static Analyses over a Source Code Folder, directly from Team Reviewer.
You can do:
Static Analyses
Mark False Positives
Enable/Disable and change Severity of existing Vulnerability Detection Rules
Add Custom Rules
Declare Recurrent False Positives by Evidence
You start Source Code Inspections by clicking Static Analysis in the main Dashboard:
...
The Static Analysis features are the same as those of Static Reviewer Desktop, but centralized and accessible from any browser:
...
You can mark False Positives in bulk using our smart interface:
...
You can Enable/Disable and change Severity of existing Vulnerability Detection Rules (authorized users only):
...
You can create your Custom Rules (authorized users only):
...
You can declare Recurring False Positives by Evidence (authorized users only):
...
SCA Server Plugin
SCA Server Plugin for Team Reviewer (to be purchased separately) is able to run Software Composition Analyses, directly from Team Reviewer.
You can do:
Software Composition Analysis of a Folder, a Container or a Git Repository containing third-party libraries
It will discover:
Blacklisted Libraries: Versions not admitted inside the organization
License Conflict: Licenses that cannot coexist with others
Outdated Libraries: Libraries or Frameworks built with a very old, unsupported JDK or .NET Framework version
Discontinued Libraries: Libraries or Frameworks abandoned by the Developer's Community
Vulnerable Frameworks: Frameworks having at least one vulnerable library
Suspicious Licenses: Licenses information that has been manipulated
Poor-man's Copyright: Self-declared copyright
Vulnerable Libraries: Vulnerable libraries that must be replaced by newer, secure versions
You start a Software Composition Analysis by clicking SCA Analysis in the main Dashboard:
...
The Software Composition Analysis features are the same as those of SCA Desktop, but centralized and accessible from any browser:
...
Once the SCA analysis has finished, you can go to the Results page:
...
You can drill-down the results Details:
...
You can view the Software Bill of Materials (SBOM):
...
And you can download Reports in PDF, JSON, Excel and HTML formats:
...
Additionally, you can have a custom Cover Letter with your logo, your ISO 9001 Responsibility chain, the Confidentiality Level and your Disclaimer.
Reports
Team Reviewer stores reports generated with:
Static Reviewer Desktop
Static Reviewer CI/CD plugins for Jenkins and GitLab
SCA Reviewer Desktop
SCA Reviewer CI/CD plugins for Jenkins and GitLab
Dynamic Reviewer
Mobile Reviewer
...
Further, you can create your own custom reports by using the Team Reviewer Report Generator.
Team Reviewer custom reports can be generated in Word, Excel, XML, HTML, and AsciiDoc. If you need different formats, open the Word report and choose Save As…
...
Reports can be generated for:
Groups of Products
Individual Products
Endpoints
Product Types
Custom Reports
...
Filtering is available on all Report Generation views to aid in focusing the report for the appropriate need.
Custom reports allow you to select specific components to be added to the report. These include:
Cover Page
Table of Contents
WYSIWYG Content
Findings List
Endpoint List
Page Breaks
The custom report workflow takes advantage of the same asynchronous report-generation process.
Notifications
Team Reviewer can inform you of different events in a variety of ways. You can be notified about things like an upcoming engagement, when someone mentions you in a comment, a scheduled report has finished generating, and more.
The following notification methods currently exist:
Email
Slack
HipChat
WebHook
Alerts within Team Reviewer
...
You can set these notifications on a system scope (if you have administrator rights) or on a personal scope. For instance, an administrator might want notifications of all upcoming engagements sent to a certain Slack channel, whereas an individual user wants email notifications to be sent to the user’s specified email address when a report has finished generating.
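For the WebHook method, your endpoint receives the notification as a payload you can route however you like. The event names and JSON fields below are assumptions for illustration; consult your Team Reviewer WebHook configuration for the real payload schema:

```python
import json

def format_alert(raw: bytes) -> str:
    """Turn a hypothetical incoming notification payload
    into a one-line alert suitable for a chat channel."""
    event = json.loads(raw)
    kind = event.get("event", "unknown")      # e.g. "engagement_upcoming"
    product = event.get("product", "n/a")
    return f"[{kind}] {product}: {event.get('message', '')}"
```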
Team Reviewer runs scheduled tasks to identify and notify you about things like upcoming engagements.
Attached Documents
Products, Engagements and Tests allow you to attach one or more documents, such as requirements docs, project docs, evidence, certifications, risk acceptances and any related docs you need.
Accepted file formats are PDF, Word, Excel and images.
Security Reviewer’s Security, Deadcode-Best Practices, Resilience and SQALE reports are uploaded as Engagement’s Attached Documents to Team Reviewer using REST APIs.
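A report upload over REST can be sketched as below. The endpoint path, token scheme, and field names are assumptions for illustration only; see the platform's Swagger documentation for the real API:

```python
import json
import urllib.request

def build_upload_request(base_url: str, token: str,
                         engagement_id: int, filename: str,
                         content: str) -> urllib.request.Request:
    """Build (but do not send) a POST attaching a document to an
    Engagement. '/api/engagements/{id}/documents' is a hypothetical path."""
    return urllib.request.Request(
        f"{base_url}/api/engagements/{engagement_id}/documents",
        data=json.dumps({"filename": filename, "content": content}).encode(),
        headers={
            "Authorization": f"Token {token}",   # assumed token scheme
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Sending it is then a single `urllib.request.urlopen(req)` call against your own instance.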
Results Correlation
Team Reviewer can import and correlate results from the following tools:
Static Reviewer, Security Reviewer Software Composition Analysis (SCA), Security Reviewer Software Resilience Analysis (SRA), Mobile Reviewer and Dynamic Reviewer XML or CSV
HCL AppScan Source ed. and Standard ed. detailed XML Report
Micro Focus Fortify SCA and WebInspect FPR
CA Veracode Detailed XML Report
Checkmarx Detailed XML Report
Rapid7 AppSpider Vulnerabilities Summary XML Report and Nexpose XML 2.0
Acunetix
Anchore
AQUA
Arachni Scanner JSON Report
AWS Prowler and Scout2
Bandit
Synopsys BlackDuck
Brakeman
BugCrowd
Contrast
ESLint
GitLab SAST
GitLeaks
GOast
GOSec
Hadolint
HuskyCI
ImmuniWeb
JFrog XRay
Kiuwan
Burp Suite XML
Nessus (CSV, XML)
NetSparker
Nexpose
NPMAudit
OpenSCAP
OpenVAS
PHP Symfony Security Check
Nmap (XML), SQLMap, NoSQLMap (text output)
OWASP ZAP XML and Dependency Check XML
Retire.js JavaScript Scan JSON
Node Security Platform JSON
Qualys XML
SonarQube
Sonatype Nexus
SourceClear
SSLScan
SSLyze
Snyk JSON
Trivy
Trustwave
PyJFuzz
WhiteSource
WpScan
Generic Findings in CSV format
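Results from tools not listed above can be supplied as generic findings in CSV. The column names below are assumptions for illustration; check the import documentation for the expected header row:

```python
import csv
import io

def findings_to_csv(findings: list) -> str:
    """Serialize a list of finding dicts into a generic CSV layout
    (hypothetical columns) ready for import."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["title", "severity", "cwe", "description"]
    )
    writer.writeheader()
    writer.writerows(findings)
    return buf.getvalue()
```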
Team Reviewer can export correlated results to the following tools:
SonarQube
Micro Focus Fortify SSC
Kenna Security
ThreadFix
ServiceNow
See our EcoSystem.
Team Reviewer can access Firmware Reviewer using Single Sign On.
Authentication via LDAP/AD
LDAP (Lightweight Directory Access Protocol) is an Internet protocol that web applications can use to look up information about users and groups from an LDAP server. You can connect Team Reviewer to an LDAP directory for authentication, user and group management. Connecting to an LDAP directory server is useful if user groups are stored in a corporate directory. Synchronization with LDAP allows the automatic creation, update and deletion of users and groups in Team Reviewer according to any changes being made in the LDAP directory.
REST API
Team Reviewer is built using a thin server architecture and an API-first design. APIs are at the heart of the platform. Every API is fully documented via Swagger 2.0.
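A Swagger 2.0 spec is plain JSON, so the documented operations are easy to enumerate programmatically. The spec fragment below is a made-up illustration, not the actual Team Reviewer spec:

```python
import json

SPEC = json.loads("""
{
  "swagger": "2.0",
  "info": {"title": "Example API", "version": "1.0"},
  "paths": {
    "/findings": {"get": {"summary": "List findings"}},
    "/engagements": {"get": {"summary": "List engagements"}}
  }
}
""")

def list_operations(spec: dict) -> list:
    """Return (METHOD, path, summary) for every documented operation."""
    return [
        (method.upper(), path, op.get("summary", ""))
        for path, ops in spec["paths"].items()
        for method, op in ops.items()
    ]
```

Pointing the same loop at your instance's published spec gives a live inventory of every endpoint.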
...