Abstract

Given the foundational role of system requirements in design projects, designers can benefit from classifying, comparing, and observing connections between requirements. Manually undertaking these processes, however, can be laborious and time-consuming. Previous studies have employed Bidirectional Encoder Representations from Transformers (BERT), a natural language processing model, to automatically analyze written requirements. Yet, it remains unclear whether BERT can capture the nuances that differentiate requirements. This work evaluates BERT’s performance on two requirement classification tasks performed on five system design documents. First, a BERT model is fine-tuned to classify requirements according to their originating project. A separate BERT model is then fine-tuned to classify each requirement as either functional or nonfunctional. The former model achieves a Matthews correlation coefficient (MCC) of 0.95, while the latter achieves an MCC of 0.82. This work then explores using BERT’s requirement representations to identify similar requirements and predict requirement change.
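
To make the fine-tuning step concrete, the sketch below trains a BERT sequence classifier to separate functional from nonfunctional requirements and scores it with MCC. It is a minimal illustration assuming the Hugging Face transformers library and the bert-base-uncased checkpoint; the example requirements, label encoding, and hyperparameters are invented placeholders, not the study's data or setup.

# A minimal sketch, assuming Hugging Face transformers and the
# bert-base-uncased checkpoint. The requirements, label encoding,
# and hyperparameters below are illustrative placeholders.
import torch
from sklearn.metrics import matthews_corrcoef
from transformers import BertForSequenceClassification, BertTokenizer

# Hypothetical requirement texts: one functional, one nonfunctional.
texts = [
    "The system shall transmit telemetry every 10 seconds.",
    "The interface shall be usable with minimal operator training.",
]
labels = [0, 1]  # assumed encoding: 0 = functional, 1 = nonfunctional

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

# Fine-tune for a few passes over the toy batch.
model.train()
for _ in range(3):
    optimizer.zero_grad()
    out = model(**enc, labels=torch.tensor(labels))
    out.loss.backward()
    optimizer.step()

# Score predictions with the Matthews correlation coefficient,
# the metric reported in the abstract.
model.eval()
with torch.no_grad():
    preds = model(**enc).logits.argmax(dim=-1)
print("MCC:", matthews_corrcoef(labels, preds.numpy()))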

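The representation-based analysis can be sketched in the same spirit: embed each requirement with a BERT encoder and measure how close the resulting vectors are. Mean pooling over the final hidden states and cosine similarity are assumptions made here for illustration; the study may derive or compare representations differently.

# A minimal sketch of identifying similar requirements: mean-pool
# BERT's final hidden states into one vector per requirement and
# compare vectors with cosine similarity. Pooling and similarity
# choices are assumptions, not the paper's confirmed method.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def embed(text: str) -> torch.Tensor:
    """Encode one requirement as a mean-pooled hidden-state vector."""
    enc = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state  # (1, seq_len, 768)
    return hidden.mean(dim=1).squeeze(0)

# Hypothetical near-duplicate requirements.
a = embed("The system shall log all user actions.")
b = embed("All user activity shall be recorded by the system.")
score = torch.nn.functional.cosine_similarity(a, b, dim=0)
print(f"cosine similarity: {score.item():.3f}")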