Software quality

In the context of software engineering, software quality refers to two related but distinct notions:[citation needed]

  • Software's functional quality reflects how well it complies with or conforms to a given design, based on functional requirements or specifications.[1] That attribute can also be described as the fitness for the purpose of a piece of software or how it compares to competitors in the marketplace as a worthwhile product.[2] It is the degree to which the correct software was produced.
  • Software structural quality refers to how it meets non-functional requirements that support the delivery of the functional requirements, such as robustness or maintainability. It is the degree to which the software works as needed.

Many aspects of structural quality can be evaluated only statically, through analysis of the software's inner structure, its source code (see Software metrics),[3] at the unit level and at the system level (sometimes referred to as end-to-end testing[4]): in effect, how well the architecture adheres to sound principles of software architecture, as outlined in a paper on the topic by the Object Management Group (OMG).[5]

Some structural qualities, such as usability, can be assessed only dynamically (users or others acting on their behalf interact with the software or, at least, some prototype or partial implementation; even interaction with a mock version made of cardboard represents a dynamic test, because such a version can be considered a prototype). Other aspects, such as reliability, might involve not only the software but also the underlying hardware; therefore, they can be assessed both statically and dynamically (stress testing).[citation needed]

Using automated tests and fitness functions can help to maintain some of the quality-related attributes.[6]
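
For example, a fitness function can run as part of the automated test suite so that a chosen structural property is re-checked on every build. The following minimal sketch in Java fails the build when any source file exceeds a size threshold; the 400-line limit and the src/main/java location are illustrative assumptions, not part of any standard:

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Stream;

// A simple structural "fitness function": the build breaks when any
// source file grows past a size threshold, continuously enforcing one
// maintainability-related attribute.
public class FileSizeFitnessFunction {
    private static final int MAX_LINES = 400;                 // assumed project convention
    private static final Path SRC = Path.of("src/main/java"); // assumed layout

    public static void main(String[] args) throws IOException {
        try (Stream<Path> files = Files.walk(SRC)) {
            List<Path> oversized = files
                .filter(p -> p.toString().endsWith(".java"))
                .filter(FileSizeFitnessFunction::tooLong)
                .toList();
            if (!oversized.isEmpty()) {
                oversized.forEach(p -> System.err.println("Too long: " + p));
                System.exit(1);                               // fail the build
            }
        }
    }

    private static boolean tooLong(Path p) {
        try {
            return Files.readAllLines(p).size() > MAX_LINES;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```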

Functional quality is typically assessed dynamically but it is also possible to use static tests (such as software reviews).[citation needed]

Historically, the structure, classification, and terminology of attributes and metrics applicable to software quality management have been derived or extracted from the ISO 9126 and the subsequent ISO/IEC 25000 standard.[7] Based on these models, the Consortium for IT Software Quality (CISQ) has defined five major desirable structural characteristics needed for a piece of software to provide business value:[8] Reliability, Efficiency, Security, Maintainability, and (adequate) Size.[9][10][11]

Software quality measurement quantifies to what extent a software program or system rates along each of these five dimensions. An aggregated measure of software quality can be computed through a qualitative or a quantitative scoring scheme, or a mix of both, and then a weighting system reflecting the priorities. This view of software quality as positioned on a linear continuum is supplemented by the analysis of "critical programming errors" that under specific circumstances can lead to catastrophic outages or performance degradations that make a given system unsuitable for use regardless of its rating based on aggregated measurements. Such programming errors found at the system level represent up to 90 percent of production issues, while unit-level programming errors, even if far more numerous, account for less than 10 percent of production issues (see also Ninety–ninety rule). As a consequence, code quality without the context of the whole system, as W. Edwards Deming described it, has limited value.[citation needed]

To view, explore, analyze, and communicate software quality measurements, concepts and techniques of information visualization provide visual, interactive means that are useful in particular when several software quality measures have to be related to each other or to components of a software system. For example, software maps represent a specialized approach that "can express and combine information about software development, software quality, and system dynamics".[12]

Software quality also plays a role in the release phase of a software project. Specifically, the quality and establishment of release processes (including patch processes)[13][14] and configuration management[15] are important parts of an overall software engineering process.[16][17][18]

Motivation


Software quality is motivated by at least two main perspectives:

  • Risk management: Software failure has caused more than inconvenience. Software errors can cause human fatalities (see for example: List of software bugs). The causes have ranged from poorly designed user interfaces to direct programming errors,[19][20][21] see for example the Boeing 737 case, the unintended acceleration cases,[22][23] or the Therac-25 cases.[24] This resulted in requirements for the development of some types of software, particularly and historically for software embedded in medical and other devices that regulate critical infrastructure: "[Engineers who write embedded software] see Java programs stalling for one third of a second to perform garbage collection and update the user interface, and they envision airplanes falling out of the sky."[25] In the United States, within the Federal Aviation Administration (FAA), the FAA Aircraft Certification Service provides software programs, policy, guidance, and training, with a focus on software and complex electronic hardware that has an effect on the airborne product (a "product" is an aircraft, an engine, or a propeller).[26] Certification standards such as DO-178C, ISO 26262, and IEC 62304 provide guidance.
  • Cost management: As in any other field of engineering, a software product or service governed by good software quality costs less to maintain, is easier to understand, and can be changed more cost-effectively in response to pressing business needs.[27] Industry data demonstrate that poor application structural quality in core business applications (such as enterprise resource planning (ERP), customer relationship management (CRM) or large transaction processing systems in financial services) results in cost and schedule overruns and creates waste in the form of rework (see Muda (Japanese term)).[28][29][30] Moreover, poor structural quality is strongly correlated with high-impact business disruptions due to corrupted data, application outages, security breaches, and performance problems.[31]
    • CISQ estimates that the cost of poor-quality software in the US reached $2.08 trillion in 2020.[32][33]
    • IBM's Cost of a Data Breach Report 2020 estimates the average global cost of a data breach at $3.86 million.[34][35]

Definitions


ISO


For some, software quality is the "capability of a software product to conform to requirements",[36][37] while for others it can be synonymous with customer value creation[38][39] or even defect level.[40] Software quality measurements can be split into three parts: process quality; product quality, which includes internal and external properties; and quality in use, which is the effect of the software.[41]

ASQ


ASQ uses the following definition: Software quality describes the desirable attributes of software products. Two main approaches exist: defect management and quality attributes.[42]

NIST


Software Assurance (SA) covers both the property and the process to achieve it:[43]

  • [Justifiable] confidence that software is free from vulnerabilities, either intentionally designed into the software or accidentally inserted at any time during its life cycle and that the software functions in the intended manner
  • The planned and systematic set of activities that ensure that software life cycle processes and products conform to requirements, standards, and procedures

PMI


The Project Management Institute's PMBOK Guide "Software Extension" defines not "software quality" itself, but Software Quality Assurance (SQA) as "a continuous process that audits other software processes to ensure that those processes are being followed (includes for example a software quality management plan)", whereas Software Quality Control (SQC) means "taking care of applying methods, tools, techniques to ensure satisfaction of the work products toward quality requirements for a software under development or modification".[44]

Other general and historic


The first definition of quality in recorded history is from Shewhart at the beginning of the 20th century: "There are two common aspects of quality: one of them has to do with the consideration of the quality of a thing as an objective reality independent of the existence of man. The other has to do with what we think, feel or sense as a result of the objective reality. In other words, there is a subjective side of quality."[45]

Kitchenham and Pfleeger, further reporting the teachings of David Garvin, identify five different perspectives on quality:[46][47]

  • The transcendental perspective deals with the metaphysical aspect of quality. In this view of quality, it is "something toward which we strive as an ideal, but may never implement completely".[48] It can hardly be defined, but is similar to what a federal judge once commented about obscenity: "I know it when I see it".[49]
  • The user perspective is concerned with the appropriateness of the product for a given context of use. Whereas the transcendental view is ethereal, the user view is more concrete, grounded in the product characteristics that meet user's needs.[48]
  • The manufacturing perspective represents quality as conformance to requirements. This aspect of quality is stressed by standards such as ISO 9001, which defines quality as "the degree to which a set of inherent characteristics fulfills requirements" (ISO/IEC 9001[50]).
  • The product perspective implies that quality can be appreciated by measuring the inherent characteristics of the product.
  • The final perspective of quality is value-based.[38] This perspective recognizes that the different perspectives of quality may have different importance, or value, to various stakeholders.

The problem inherent in attempts to define the quality of a product, almost any product, was stated by the master Walter A. Shewhart. The difficulty in defining quality is to translate the future needs of the user into measurable characteristics, so that a product can be designed and turned out to give satisfaction at a price that the user will pay. This is not easy, and as soon as one feels fairly successful in the endeavor, he finds that the needs of the consumer have changed, competitors have moved in, etc.[51]

Quality is a customer determination, not an engineer's determination, not a marketing determination, nor a general management determination. It is based on the customer's actual experience with the product or service, measured against his or her requirements -- stated or unstated, conscious or merely sensed, technically operational or entirely subjective -- and always representing a moving target in a competitive market.[52]

The word quality has multiple meanings. Two of these meanings dominate the use of the word: 1. Quality consists of those product features which meet the need of customers and thereby provide product satisfaction. 2. Quality consists of freedom from deficiencies. Nevertheless, in a handbook such as this it is convenient to standardize on a short definition of the word quality as "fitness for use".[53]

Tom DeMarco has proposed that "a product's quality is a function of how much it changes the world for the better."[citation needed] This can be interpreted as meaning that functional quality and user satisfaction are more important than structural quality in determining software quality.

Another definition, coined by Gerald Weinberg in Quality Software Management: Systems Thinking, is "Quality is value to some person."[54][55]

Other meanings and controversies


One of the challenges in defining quality is that "everyone feels they understand it"[56] and other definitions of software quality could be based on extending the various descriptions of the concept of quality used in business.

Software quality is also often conflated with Quality Assurance, Problem Resolution Management,[57] Quality Control,[58] or DevOps. It overlaps with these areas (see also the PMI definitions above), but it is distinctive in that it does not focus solely on testing but also on processes, management, improvements, and assessments.[58]

Measurement


Although the concepts presented in this section are applicable to both structural and functional software quality, measurement of the latter is essentially performed through software testing.[59] Testing is not enough: According to one study, "individual programmers are less than 50% efficient at finding bugs in their own software. And most forms of testing are only 35% efficient. This makes it difficult to determine [software] quality."[60]

Introduction

Figure: Relationship between software desirable characteristics (right) and measurable attributes (left)

Software quality measurement is about quantifying to what extent a system or software possesses desirable characteristics. This can be performed through qualitative or quantitative means, or a mix of both. In either case, for each desirable characteristic there is a set of measurable attributes whose presence in a piece of software or system tends to be correlated and associated with that characteristic. For example, an attribute associated with portability is the number of target-dependent statements in a program. More precisely, using the Quality Function Deployment approach, these measurable attributes are the "hows" that need to be enforced to enable the "whats" in the software quality definition above.
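
As a concrete illustration of such an attribute, the count of target-dependent statements can be approximated by a crude textual scan. The sketch below is an assumption of what a minimal checker might look like; the marker list is invented for illustration, and a real analyzer would work on a parsed syntax tree rather than raw text:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

// Crude portability attribute: count statements that reference
// platform-specific details. The markers are illustrative assumptions.
public class TargetDependentStatementCounter {
    private static final List<String> MARKERS = List.of(
        "os.name",                       // querying the operating system
        "C:\\",                          // hard-coded Windows path
        "/usr/",                         // hard-coded Unix path
        "Runtime.getRuntime().exec");    // shelling out to the platform

    public static long count(Path javaFile) throws IOException {
        return Files.readAllLines(javaFile).stream()
            .filter(line -> MARKERS.stream().anyMatch(line::contains))
            .count();
    }
}
```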

The structure, classification and terminology of attributes and metrics applicable to software quality management have been derived or extracted from the ISO 9126-3 and the subsequent ISO/IEC 25000:2005 quality model. The main focus is on internal structural quality. Subcategories have been created to handle specific areas like business application architecture and technical characteristics such as data access and manipulation or the notion of transactions.

The dependence tree between software quality characteristics and their measurable attributes is represented in the diagram on the right, where each of the 5 characteristics that matter for the user (right) or owner of the business system depends on measurable attributes (left):

  • Application Architecture Practices
  • Coding Practices
  • Application Complexity
  • Documentation
  • Portability
  • Technical and Functional Volume

Correlations between programming errors and production defects reveal that basic code errors account for 92 percent of the total errors in the source code. These numerous code-level issues eventually account for only 10 percent of the defects in production. Bad software engineering practices at the architecture level account for only 8 percent of total defects, but consume over half the effort spent on fixing problems, and lead to 90 percent of the serious reliability, security, and efficiency issues in production.[61][62]

Code-based analysis


Many of the existing software measures count structural elements of the application obtained by parsing the source code, such as individual instructions,[63] tokens,[64] control structures (see Complexity), and objects.[65]
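
For instance, the count of control structures underlies McCabe's cyclomatic complexity. The sketch below approximates it by counting decision points in a method body; counting keywords in raw text is an illustrative shortcut, since real tools count nodes of the parse tree:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Rough cyclomatic-complexity estimate for one method body:
// 1 + the number of decision points found in the text.
public class ComplexityEstimator {
    private static final Pattern DECISION =
        Pattern.compile("\\b(if|for|while|case|catch)\\b|&&|\\|\\|");

    public static int estimate(String methodBody) {
        Matcher m = DECISION.matcher(methodBody);
        int decisions = 0;
        while (m.find()) {
            decisions++;
        }
        return 1 + decisions;
    }
}
```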

Software quality measurement is about quantifying to what extent a system or software rates along these dimensions. The analysis can be performed using a qualitative or quantitative approach, or a mix of both, to provide an aggregate view (using, for example, weighted averages that reflect the relative importance of the factors being measured).
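
As a minimal illustration of such an aggregation, the five CISQ characteristics could each be scored on a common 0-100 scale and combined with weights reflecting stakeholder priorities. The scores and weights below are invented for the example:

```java
import java.util.Map;

// Aggregate quality score as a weighted average of per-characteristic
// scores. Scores and weights are illustrative assumptions; a real scheme
// derives the weights from business priorities.
public class AggregateQualityScore {
    public static double weightedScore(Map<String, Double> scores,
                                       Map<String, Double> weights) {
        double total = 0.0;
        double weightSum = 0.0;
        for (Map.Entry<String, Double> e : scores.entrySet()) {
            double w = weights.getOrDefault(e.getKey(), 0.0);
            total += w * e.getValue();
            weightSum += w;
        }
        return weightSum == 0.0 ? 0.0 : total / weightSum;
    }

    public static void main(String[] args) {
        Map<String, Double> scores = Map.of(
            "Reliability", 82.0, "Efficiency", 74.0, "Security", 91.0,
            "Maintainability", 65.0, "Size", 70.0);
        Map<String, Double> weights = Map.of(
            "Reliability", 0.30, "Efficiency", 0.15, "Security", 0.30,
            "Maintainability", 0.15, "Size", 0.10);
        System.out.printf("Aggregate score: %.1f%n",
            weightedScore(scores, weights));
    }
}
```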

This view of software quality on a linear continuum has to be supplemented by the identification of discrete critical programming errors. These vulnerabilities may not fail a test case, but they are the result of bad practices that under specific circumstances can lead to catastrophic outages, performance degradations, security breaches, corrupted data, and myriad other problems[66] that make a given system de facto unsuitable for use regardless of its rating based on aggregated measurements. A well-known catalog of such weaknesses is the Common Weakness Enumeration,[67] a repository of weaknesses in source code that leave applications exposed to security breaches.

The measurement of critical application characteristics involves measuring structural attributes of the application's architecture, coding, and in-line documentation, as displayed in the picture above. Thus, each characteristic is affected by attributes at numerous levels of abstraction in the application, all of which must be included in calculating the characteristic's measure if it is to be a valuable predictor of quality outcomes that affect the business. The layered approach to calculating characteristic measures displayed in the figure above was first proposed by Boehm and his colleagues at TRW (Boehm, 1978)[68] and is the approach taken in the ISO 9126 and 25000 series standards. These attributes can be measured from the parsed results of a static analysis of the application source code. Even dynamic characteristics of applications such as reliability and performance efficiency have their causal roots in the static structure of the application.

Structural quality analysis and measurement are performed through the analysis of the source code, the architecture, the software framework, and the database schema, in relation to principles and standards that together define the conceptual and logical architecture of a system. This is distinct from the basic, local, component-level code analysis typically performed by development tools, which is mostly concerned with implementation considerations and is crucial during debugging and testing activities.

Reliability


The root causes of poor reliability are found in a combination of non-compliance with good architectural and coding practices. This non-compliance can be detected by measuring the static quality attributes of an application. Assessing the static attributes underlying an application's reliability provides an estimate of the level of business risk and the likelihood of potential application failures and defects the application will experience when placed in operation.

Assessing reliability requires checks of at least the following software engineering best practices and technical attributes:

  • Application Architecture Practices
  • Coding Practices
  • Complexity of algorithms
  • Complexity of programming practices
  • Compliance with Object-Oriented and Structured Programming best practices (when applicable)
  • Component or pattern re-use ratio
  • Dirty programming
  • Error & Exception handling (for all layers - GUI, Logic & Data)
  • Multi-layer design compliance
  • Resource bounds management
  • Software avoids patterns that will lead to unexpected behaviors
  • Software manages data integrity and consistency
  • Transaction complexity level

Depending on the application architecture and the third-party components used (such as external libraries or frameworks), custom checks should be defined along the lines drawn by the above list of best practices to ensure a better assessment of the reliability of the delivered software.

Efficiency


As with Reliability, the causes of performance inefficiency are often found in violations of good architectural and coding practice which can be detected by measuring the static quality attributes of an application. These static attributes predict potential operational performance bottlenecks and future scalability problems, especially for applications requiring high execution speed for handling complex algorithms or huge volumes of data.

Assessing performance efficiency requires checking at least the following software engineering best practices and technical attributes:

  • Application Architecture Practices
  • Appropriate interactions with expensive and/or remote resources
  • Data access performance and data management (an illustrative sketch follows this list)
  • Memory, network and disk space management
  • Compliance with Coding Practices[69] (Best coding practices)
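
A classic violation of the data-access practice flagged above is the "N+1 query" pattern: issuing one database round-trip per item inside a loop instead of a single set-based query. The sketch below contrasts the two; the orders table and its columns are invented for the example:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Illustrative "N+1 query" inefficiency and a set-based replacement.
// Table and column names are assumptions made for the example.
public class OrderLoader {

    // Inefficient: one database round-trip per customer id.
    static List<String> loadSlow(Connection c, List<Integer> ids) throws SQLException {
        List<String> refs = new ArrayList<>();
        for (int id : ids) {
            try (PreparedStatement ps = c.prepareStatement(
                    "SELECT ref FROM orders WHERE customer_id = ?")) {
                ps.setInt(1, id);
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        refs.add(rs.getString("ref"));
                    }
                }
            }
        }
        return refs;
    }

    // Better: a single query fetches all matching rows in one round-trip.
    static List<String> loadFast(Connection c, List<Integer> ids) throws SQLException {
        String placeholders = String.join(",", Collections.nCopies(ids.size(), "?"));
        String sql = "SELECT ref FROM orders WHERE customer_id IN (" + placeholders + ")";
        List<String> refs = new ArrayList<>();
        try (PreparedStatement ps = c.prepareStatement(sql)) {
            for (int i = 0; i < ids.size(); i++) {
                ps.setInt(i + 1, ids.get(i));
            }
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    refs.add(rs.getString("ref"));
                }
            }
        }
        return refs;
    }
}
```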

Security


Software quality includes software security.[70] Many security vulnerabilities result from poor coding and architectural practices such as SQL injection or cross-site scripting.[71][72] These are well documented in lists maintained by CWE[73] and the SEI's Computer Emergency Response Team (CERT) at Carnegie Mellon University.[69]
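
For example, SQL injection (CWE-89) typically arises from concatenating untrusted input into a query string, and the standard remediation is a parameterized query. The sketch below shows both; the users table and its column are invented for the example:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

// Illustrative SQL injection weakness (CWE-89) and its remediation.
// Table and column names are assumptions made for the example.
public class UserLookup {

    // Vulnerable: attacker-controlled input becomes part of the SQL text;
    // name = "x' OR '1'='1" would return every row.
    static ResultSet findUnsafe(Connection c, String name) throws SQLException {
        Statement st = c.createStatement();
        return st.executeQuery(
            "SELECT * FROM users WHERE name = '" + name + "'");
    }

    // Safe: the driver transmits the value as data, never as SQL text.
    static ResultSet findSafe(Connection c, String name) throws SQLException {
        PreparedStatement ps = c.prepareStatement(
            "SELECT * FROM users WHERE name = ?");
        ps.setString(1, name);
        return ps.executeQuery();
    }
}
```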

Assessing security requires at least checking the following software engineering best practices and technical attributes:

  • Implementation and management of a security-aware development process that includes hardening, e.g. the Security Development Lifecycle (Microsoft) or IBM's Secure Engineering Framework.[74]
  • Secure Application Architecture Practices[75][76]
  • Multi-layer design compliance
  • Security best practices (Input Validation, SQL Injection, Cross-Site Scripting, Access control etc.)[77][78]
  • Secure and good Programming Practices[69]
  • Error & Exception handling

Maintainability


Maintainability includes concepts of modularity, understandability, changeability, testability, reusability, and transferability from one development team to another. These do not take the form of critical issues at the code level. Rather, poor maintainability is typically the result of thousands of minor violations of best practices in documentation, complexity avoidance strategy, and basic programming practices that make the difference between clean and easy-to-read code and unorganized and difficult-to-read code.[79]

Assessing maintainability requires checking the following software engineering best practices and technical attributes:

  • Application Architecture Practices
  • Architecture, Programs and Code documentation embedded in source code
  • Code readability
  • Code smells
  • Complexity level of transactions
  • Complexity of algorithms
  • Complexity of programming practices
  • Compliance with Object-Oriented and Structured Programming best practices (when applicable)
  • Component or pattern re-use ratio
  • Controlled level of dynamic coding
  • Coupling ratio
  • Dirty programming
  • Documentation
  • Hardware, OS, middleware, software components and database independence
  • Multi-layer design compliance
  • Portability
  • Programming Practices (code level)
  • Reduced duplicate code and functions
  • Source code file organization cleanliness

Maintainability is closely related to Ward Cunningham's concept of technical debt, which is an expression of the costs resulting from a lack of maintainability. Reasons why maintainability is low can be classified as reckless vs. prudent and deliberate vs. inadvertent,[80][81] and often originate in developers' inability, lack of time and goals, carelessness, and discrepancies between the creation cost of, and the benefits from, documentation and, in particular, maintainable source code.[82]

Size


Measuring software size requires that the whole source code be correctly gathered, including database structure scripts, data manipulation source code, component headers, configuration files, etc. There are essentially two types of software size to be measured: the technical size (footprint) and the functional size:

  • Several software technical sizing methods have been widely described. The most common technical sizing method counts lines of code (#LOC) per technology, and numbers of files, functions, classes, tables, etc., from which backfired function points can be computed;
  • The most common method for measuring functional size is function point analysis. Function point analysis measures the size of the software deliverable from a user's perspective. It is performed based on user requirements and provides an accurate representation of both size for the developer/estimator and value (functionality to be delivered), and it reflects the business functionality being delivered to the customer. The method includes the identification and weighting of user-recognizable inputs, outputs, and data stores. The size value is then available for use in conjunction with numerous measures to quantify and to evaluate software delivery and performance (e.g., development cost per function point, delivered defects per function point, function points per staff month).

The function point analysis sizing standard is supported by the International Function Point Users Group (IFPUG). It can be applied early in the software development life-cycle and it is not dependent on lines of code like the somewhat inaccurate Backfiring method. The method is technology agnostic and can be used for comparative analysis across organizations and across industries.
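
As a minimal worked example, an unadjusted function point (UFP) count sums, over the five component types, the number of identified components times a complexity weight. The component counts below are invented, and the weights shown are the commonly cited IFPUG average-complexity values; a real count also classifies each component's complexity individually and may apply a value adjustment factor:

```java
// Unadjusted function point (UFP) count for a hypothetical application,
// using the commonly cited IFPUG average-complexity weights.
// Component counts are invented for the illustration.
public class FunctionPointExample {
    public static void main(String[] args) {
        int ufp = 5 * 4    // 5 external inputs          x weight 4
                + 3 * 5    // 3 external outputs         x weight 5
                + 2 * 4    // 2 external inquiries       x weight 4
                + 2 * 10   // 2 internal logical files   x weight 10
                + 1 * 7;   // 1 external interface file  x weight 7
        System.out.println("Unadjusted function points: " + ufp); // prints 70
    }
}
```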

Since the inception of Function Point Analysis, several variations have evolved and the family of functional sizing techniques has broadened to include such sizing measures as COSMIC, NESMA, Use Case Points, FP Lite, Early and Quick FPs, and most recently Story Points. Function Point has a history of statistical accuracy, and has been used as a common unit of work measurement in numerous application development management (ADM) or outsourcing engagements, serving as the "currency" by which services are delivered and performance is measured.

One common limitation of the function point methodology is that it is a manual process and can therefore be labor-intensive and costly in large-scale initiatives such as application development or outsourcing engagements. This negative aspect of applying the methodology may be what motivated industry IT leaders to form the Consortium for IT Software Quality, focused on introducing a computable metrics standard for automating the measurement of software size, while the IFPUG keeps promoting a manual approach, as most of its activity relies on FP counter certifications.

CISQ defines sizing as estimating the size of software to support cost estimating, progress tracking, and other related software project management activities. Two standards are used: Automated Function Points to measure the functional size of software, and Automated Enhancement Points to measure the size of both functional and non-functional code in one measure.[83]

Identifying critical programming errors


Critical Programming Errors are specific architectural and/or coding bad practices that result in the highest, immediate or long term, business disruption risk.[84]

These are quite often technology-related and depend heavily on the context, business objectives, and risks. Some may consider respect for naming conventions a minor concern, while others (those preparing the ground for a knowledge transfer, for example) will consider it absolutely critical.

Critical programming errors can also be classified per CISQ characteristic. Basic examples are given below:

  • Reliability
    • Avoid software patterns that will lead to unexpected behavior (Uninitialized variable, null pointers, etc.)
    • Methods, procedures and functions doing Insert, Update, Delete, Create Table or Select must include error management
    • Multi-thread functions should be made thread-safe; for instance, servlets or Struts action classes must not have instance/non-final static fields (a sketch follows this list)
  • Efficiency
    • Ensure centralization of client requests (incoming and data) to reduce network traffic
    • Avoid SQL queries that don't use an index against large tables in a loop
  • Security
    • Avoid fields in servlet classes that are not final static
    • Avoid data access without including error management
    • Check control return codes and implement error handling mechanisms
    • Ensure input validation to avoid cross-site scripting flaws or SQL injections flaws
  • Maintainability
    • Deep inheritance trees and nesting should be avoided to improve comprehensibility
    • Modules should be loosely coupled (fanout, intermediaries) to avoid propagation of modifications
    • Enforce homogeneous naming conventions
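
As an illustration of the servlet rule flagged in the reliability list above, the container shares a single servlet instance across concurrent requests, so mutable instance state is a race condition. The sketch below uses the classic javax.servlet API; the hit counter is invented for the example:

```java
import java.io.IOException;
import java.util.concurrent.atomic.AtomicLong;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// One servlet instance serves many threads at once, so an unsynchronized
// instance field such as "private long hits;" would be a data race.
// Keeping the servlet stateless, or using a thread-safe type as below,
// satisfies the rule.
public class CounterServlet extends HttpServlet {
    private final AtomicLong hits = new AtomicLong();

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        resp.getWriter().println("Hit number " + hits.incrementAndGet());
    }
}
```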

Operationalized quality models


Newer proposals for quality models such as Squale and Quamoco[85] advocate a direct integration of the definition of quality attributes and their measurement. By breaking down quality attributes or even defining additional layers, complex, abstract quality attributes (such as reliability or maintainability) become more manageable and measurable. These quality models have been applied in industrial contexts but have not received widespread adoption.


Further reading

  • Android OS Quality Guidelines including checklists for UI, Security, etc. July 2021
  • Association of Maritime Managers in Information Technology & Communications (AMMITEC). Maritime Software Quality Guidelines. September 2017
  • Capers Jones and Olivier Bonsignour, "The Economics of Software Quality", Addison-Wesley Professional, 1st edition, December 31, 2011, ISBN 978-0-13-258220-9
  • CAT Lab - CNES Code Analysis Tools Laboratory (on GitHub)
  • Girish Suryanarayana, Software Process versus Design Quality: Tug of War?[86]
  • Ho-Won Jung, Seung-Gweon Kim, and Chang-Sin Chung. Measuring software product quality: A survey of ISO/IEC 9126. IEEE Software, 21(5):10–13, September/October 2004.
  • International Organization for Standardization. Software Engineering—Product Quality—Part 1: Quality Model. ISO, Geneva, Switzerland, 2001. ISO/IEC 9126-1:2001(E).
  • Measuring Software Product Quality: the ISO 25000 Series and CMMI (SEI site)
  • MSQF - A measurement based software quality framework Cornell University Library
  • Omar Alshathry, Helge Janicke, "Optimizing Software Quality Assurance," compsacw, pp. 87–92, 2010 IEEE 34th Annual Computer Software and Applications Conference Workshops, 2010.
  • Robert L. Glass. Building Quality Software. Prentice Hall, Upper Saddle River, NJ, 1992.
  • Roland Petrasch, "The Definition of 'Software Quality': A Practical Approach", ISSRE, 1999
  • Software Quality Professional,[87] American Society for Quality (ASQ)
  • Software Quality Journal[88] by Springer Nature
  • Spinellis, Diomidis (2006). Code Quality: The Open Source Perspective. Upper Saddle River, New Jersey, US: Addison-Wesley Professional. ISBN 978-0-321-16607-4.
  • Stephen H. Kan. Metrics and Models in Software Quality Engineering. Addison-Wesley, Boston, MA, second edition, 2002.
  • Stefan Wagner. Software Product Quality Control. Springer, 2013.

References


Notes

  1. ^ "Learning from history: The case of Software Requirements Engineering – Requirements Engineering Magazine". Learning from history: The case of Software Requirements Engineering – Requirements Engineering Magazine. Retrieved 2025-08-06.
  2. ^ Pressman, Roger S. (2005). Software Engineering: A Practitioner's Approach (Sixth International ed.). McGraw-Hill Education. p. 388. ISBN 0071267824.
  3. ^ "About the Automated Source Code Quality Measures Specification Version 1.0". www.omg.org. Retrieved 2025-08-06.
  4. ^ "How to Perform End-to-End Testing". smartbear.com. Retrieved 2025-08-06.
  5. ^ "How to Deliver Resilient, Secure, Efficient, and Easily Changed IT Systems in Line with CISQ Recommendations" (PDF). Archived (PDF) from the original on 2025-08-06. Retrieved 2025-08-06.
  6. ^ Fundamentals of Software Architecture: An Engineering Approach. O'Reilly Media. 2020. ISBN 978-1492043454.
  7. ^ "ISO/IEC 25010:2011". ISO. Retrieved 2025-08-06.
  8. ^ Armour, Phillip G. (2025-08-06). "A measure of control". Communications of the ACM. 55 (6): 26–28. doi:10.1145/2184319.2184329. ISSN 0001-0782. S2CID 6059054.
  9. ^ Voas, J. (November 2004). "Software's secret sauce: the "-ilities" [software quality]". IEEE Software. 21 (6): 14–15. doi:10.1109/MS.2004.54. ISSN 1937-4194.
  10. ^ "Code Quality Standards | CISQ - Consortium for Information & Software Quality". www.it-cisq.org. Retrieved 2025-08-06.
  11. ^ "Software Sizing Standards | CISQ - Consortium for Information & Software Quality". www.it-cisq.org. Retrieved 2025-08-06.
  12. ^ J. Bohnet, J. Döllner Archived 2025-08-06 at the Wayback Machine, "Monitoring Code Quality and Development Activity by Software Maps". Proceedings of the IEEE ACM ICSE Workshop on Managing Technical Debt, pp. 9-16, 2011.
  13. ^ "IIA - Global Technology Audit Guide: IT Change Management: Critical for Organizational Success". na.theiia.org. Retrieved 2025-08-06.
  14. ^ Boursier, Jérôme (2025-08-06). "Meltdown and Spectre fallout: patching problems persist". Malwarebytes Labs. Retrieved 2025-08-06.
  15. ^ "Best practices for software updates - Configuration Manager". docs.microsoft.com. Retrieved 2025-08-06.
  16. ^ Wright, Hyrum K. (2025-08-06). "Release engineering processes, models, and metrics". Proceedings of the doctoral symposium for ESEC/FSE on Doctoral symposium. ESEC/FSE Doctoral Symposium '09. Amsterdam, the Netherlands: Association for Computing Machinery. pp. 27–28. doi:10.1145/1595782.1595793. ISBN 978-1-60558-731-8. S2CID 10483918.
  17. ^ van der Hoek, André; Hall, Richard S.; Heimbigner, Dennis; Wolf, Alexander L. (November 1997). "Software release management". ACM SIGSOFT Software Engineering Notes. 22 (6): 159–175. doi:10.1145/267896.267909. ISSN 0163-5948.
  18. ^ Sutton, Mike; Moore, Tym (2025-08-06). "7 Ways to Improve Your Software Release Management". CIO. Retrieved 2025-08-06.
  19. ^ Clark, Mitchell (2025-08-06). "iRobot says it'll be a few weeks until it can clean up its latest Roomba software update mess". The Verge. Retrieved 2025-08-06.
  20. ^ "Top 25 Software Errors". www.sans.org. Retrieved 2025-08-06.
  21. ^ "'Turn it Off and On Again Every 149 Hours' Is a Concerning Remedy for a $300 Million Airbus Plane's Software Bug". Gizmodo. 30 July 2019. Retrieved 2025-08-06.
  22. ^ "MISRA C, Toyota and the Death of Task X". Retrieved 2025-08-06.
  23. ^ "An Update on Toyota and Unintended Acceleration ? Barr Code". embeddedgurus.com. Retrieved 2025-08-06.
  24. ^ Medical Devices: The Therac-25* Archived 2025-08-06 at the Wayback Machine, Nancy Leveson, University of Washington
  25. ^ Embedded Software Archived 2025-08-06 at the Wayback Machine, Edward A. Lee, To appear in Advances in Computers (Marvin Victor Zelkowitz, editor), Vol. 56, Academic Press, London, 2002, Revised from UCB ERL Memorandum M01/26 University of California, Berkeley, CA 94720, USA, November 1, 2001
  26. ^ "Aircraft Certification Software and Airborne Electronic Hardware". Archived from the original on 4 October 2014. Retrieved 28 September 2014.
  27. ^ "The Cost of Poor Software Quality in the US: A 2020 Report | CISQ - Consortium for Information & Software Quality". www.it-cisq.org. Retrieved 2025-08-06.
  28. ^ "What is Waste? | Agile Alliance". Agile Alliance |. 20 April 2016. Retrieved 2025-08-06.
  29. ^ Matteson, Scott (January 26, 2018). "Report: Software failure caused $1.7 trillion in financial losses in 2017". TechRepublic. Retrieved 2025-08-06.
  30. ^ Cohane, Ryan (2025-08-06). "Financial Cost of Software Bugs". Medium. Retrieved 2025-08-06.
  31. ^ Eloff, Jan; Bella, Madeleine Bihina (2018), "Software Failures: An Overview", Software Failure Investigation, Cham: Springer International Publishing, pp. 7–24, doi:10.1007/978-3-319-61334-5_2, ISBN 978-3-319-61333-8, retrieved 2025-08-06
  32. ^ "Poor software quality cost businesses $2 trillion last year and put security at risk". CIO Dive. Retrieved 2025-08-06.
  33. ^ "Synopsys-Sponsored CISQ Research Estimates Cost of Poor Software Quality in the US $2.08 Trillion in 2020". finance.yahoo.com. Retrieved 2025-08-06.
  34. ^ "What Does a Data Breach Cost in 2020?". Digital Guardian. 2025-08-06. Retrieved 2025-08-06.
  35. ^ "Cost of a Data Breach Report 2020 | IBM". www.ibm.com. 2020. Retrieved 2025-08-06.
  36. ^ "ISO - ISO 9000 family — Quality management". ISO. Retrieved 2025-08-06.
  37. ^ "ISO/IEC/IEEE 24765:2017". ISO. Retrieved 2025-08-06.
  38. ^ a b "Mastering automotive software". www.mckinsey.com. Retrieved 2025-08-06.
  39. ^ "ISO/IEC 25010:2011". ISO. Retrieved 2025-08-06.
  40. ^ Wallace, D.R. (2002). "Practical software reliability modeling". Proceedings 26th Annual NASA Goddard Software Engineering Workshop. Greenbelt, MD, USA: IEEE Comput. Soc. pp. 147–155. doi:10.1109/SEW.2001.992668. ISBN 978-0-7695-1456-7. S2CID 57382117.
  41. ^ "ISO/IEC 25023:2016". ISO. Retrieved 2025-08-06.
  42. ^ "What is Software Quality? | ASQ". asq.org. Retrieved 2025-08-06.
  43. ^ "SAMATE - Software Assurance Metrics And Tool Evaluation project main page". NIST. 3 February 2021. Retrieved 2025-08-06.
  44. ^ Software extension to the PMBOK guide. Project Management Institute (5th ed.). Newtown Square, Pennsylvania. 2013. ISBN 978-1-62825-041-1. OCLC 959513383.{{cite book}}: CS1 maint: location missing publisher (link) CS1 maint: others (link)
  45. ^ Shewhart, Walter A. (2015). Economic Control of Quality of Manufactured Product. [Place of publication not identified]: Martino Fine Books. ISBN 978-1-61427-811-5. OCLC 1108913766.
  46. ^ Kitchenham, B.; Pfleeger, S. L. (January 1996). "Software quality: the elusive target [special issues section]". IEEE Software. 13 (1): 12–21. doi:10.1109/52.476281. ISSN 1937-4194.
  47. ^ Garvin, David A. (1988). Managing quality : the strategic and competitive edge. New York: Free Press. ISBN 0-02-911380-6. OCLC 16005388.
  48. ^ a b B. Kitchenham and S. Pfleeger, "Software quality: the elusive target", IEEE Software, vol. 13, no. 1, pp. 12–21, 1996.
  49. ^ Kan, Stephen H. (2003). Metrics and models in software quality engineering (2nd ed.). Boston: Addison-Wesley. ISBN 0-201-72915-6. OCLC 50149641.
  50. ^ International Organization for Standardization, "ISO/IEC 9001: Quality management systems -- Requirements," 1999.
  51. ^ W. E. Deming, "Out of the crisis: quality, productivity and competitive position". Cambridge University Press, 1988.
  52. ^ A. V. Feigenbaum, "Total Quality Control", McGraw-Hill, 1983.
  53. ^ J.M. Juran, "Juran's Quality Control Handbook", McGraw-Hill, 1988.
  54. ^ Weinberg, Gerald M. (1991). Quality software management: Volume 1, Systems Thinking. New York, N.Y.: Dorset House. ISBN 0-932633-22-6. OCLC 23870230.
  55. ^ Weinberg, Gerald M. (1993). Quality software management: Volume 2, First-Order Measurement. New York, N.Y.: Dorset House. ISBN 0-932633-22-6. OCLC 23870230.
  56. ^ Crosby, P., Quality is Free, McGraw-Hill, 1979
  57. ^ "SUP.9 – Problem Resolution Management - Kugler Maag Cie". www.kuglermaag.com. Retrieved 2025-08-06.
  58. ^ a b Hoipt (2025-08-06). "Organizations often use the terms 'Quality Assurance' (QA) vs 'Quality Control' (QC)…". Medium. Retrieved 2025-08-06.
  59. ^ Wallace, D.; Watson, A. H.; Mccabe, T. J. (2025-08-06). "Structured Testing: A Testing Methodology Using the Cyclomatic Complexity Metric". NIST.
  60. ^ Bellairs, Richard. "What Is Code Quality? And How to Improve Code Quality". Perforce Software. Retrieved 2025-08-06.
  61. ^ "OMG Whitepaper | CISQ - Consortium for Information & Software Quality". www.it-cisq.org. Retrieved 2025-08-06.
  62. ^ "How to Deliver Resilient, Secure, Efficient and Agile IT Systems in Line with CISQ Recommendations - Whitepaper | Object Management Group" (PDF). Archived (PDF) from the original on 2025-08-06. Retrieved 2025-08-06.
  63. ^ "Software Size Measurement: A Framework for Counting Source Statements". resources.sei.cmu.edu. 31 August 1992. Retrieved 2025-08-06.
  64. ^ Halstead, Maurice H. (1977). Elements of Software Science (Operating and programming systems series). USA: Elsevier Science Inc. ISBN 978-0-444-00205-1.
  65. ^ Chidamber, S. R.; Kemerer, C. F. (June 1994). "A metrics suite for object oriented design". IEEE Transactions on Software Engineering. 20 (6): 476–493. doi:10.1109/32.295895. hdl:1721.1/48424. ISSN 1939-3520. S2CID 9493847.
  66. ^ Nygard, Michael (2007). Release It!. an O'Reilly Media Company Safari (1st ed.). Pragmatic Bookshelf. ISBN 978-0978739218. OCLC 1102387436.
  67. ^ "CWE - Common Weakness Enumeration". cwe.mitre.org. Archived from the original on 2025-08-06. Retrieved 2025-08-06.
  68. ^ Boehm, B., Brown, J.R., Kaspar, H., Lipow, M., MacLeod, G.J., & Merritt, M.J. (1978). Characteristics of Software Quality. North-Holland.
  69. ^ a b c "SEI CERT Coding Standards - CERT Secure Coding - Confluence". wiki.sei.cmu.edu. Retrieved 2025-08-06.
  70. ^ "Code quality and code security: How are they related? | Synopsys". Software Integrity Blog. 2025-08-06. Retrieved 2025-08-06.
  71. ^ "Cost of a Data Breach Report 2020 | IBM". www.ibm.com. 2020. Retrieved 2025-08-06.
  72. ^ "Key Takeaways from the 2020 Cost of a Data Breach Report". Bluefin. 2025-08-06. Retrieved 2025-08-06.
  73. ^ "CWE - Common Weakness Enumeration". Cwe.mitre.org. Archived from the original on 2025-08-06. Retrieved 2025-08-06.
  74. ^ Security in Development: The IBM Secure Engineering Framework | IBM Redbooks. 2025-08-06.
  75. ^ Enterprise Security Architecture Using IBM Tivoli Security Solutions | IBM Redbooks. 2025-08-06.
  76. ^ "Secure Architecture Design Definitions | CISA". us-cert.cisa.gov. Retrieved 2025-08-06.
  77. ^ "OWASP Foundation | Open Source Foundation for Application Security". owasp.org. Retrieved 2025-08-06.
  78. ^ "CWE's Top 25". Sans.org. Retrieved 2025-08-06.
  79. ^ IfSQ Level-2 A Foundation-Level Standard for Computer Program Source Code Archived 2025-08-06 at the Wayback Machine, Second Edition August 2008, Graham Bolton, Stuart Johnston, IfSQ, Institute for Software Quality.
  80. ^ Fowler, Martin (October 14, 2009). "TechnicalDebtQuadrant". Archived from the original on February 2, 2013. Retrieved February 4, 2013.
  81. ^ "Code quality: a concern for businesses, bottom lines, and empathetic programmers". Stack Overflow. 2025-08-06. Retrieved 2025-08-06.
  82. ^ Prause, Christian; Durdik, Zoya (June 3, 2012). "Architectural design and documentation: Waste in agile development?". 2012 International Conference on Software and System Process (ICSSP). IEEE Computer Society. pp. 130–134. doi:10.1109/ICSSP.2012.6225956. ISBN 978-1-4673-2352-9. S2CID 15216552.
  83. ^ "Software Sizing Standards | CISQ - Consortium for Information & Software Quality". www.it-cisq.org. Retrieved 2025-08-06.
  84. ^ "Why Software fails". IEEE Spectrum: Technology, Engineering, and Science News. 2 September 2005. Retrieved 2025-08-06.
  85. ^ Wagner, Stefan; Goeb, Andreas; Heinemann, Lars; Kläs, Michael; Lampasona, Constanza; Lochmann, Klaus; Mayr, Alois; Plösch, Reinhold; Seidl, Andreas (2015). "Operationalised product quality models and assessment: The Quamoco approach" (PDF). Information and Software Technology. 62: 101–123. arXiv:1611.09230. doi:10.1016/j.infsof.2015.02.009. S2CID 10992384.
  86. ^ Suryanarayana, Girish (2015). "Software Process versus Design Quality: Tug of War?". IEEE Software. 32 (4): 7–11. doi:10.1109/MS.2015.87. S2CID 9226051.
  87. ^ "Software Quality Professional | ASQ". asq.org. Retrieved 2025-08-06.
  88. ^ "Software Quality Journal". Springer. Retrieved 2025-08-06.

Bibliography

  • Albrecht, A. J. (1979), Measuring application development productivity. In Proceedings of the Joint SHARE/GUIDE IBM Applications Development Symposium, IBM
  • Ben-Menachem, M.; Marliss, G. S. (1997), Software Quality, Producing Practical and Consistent Software, Thomson Computer Press
  • Boehm, B.; Brown, J.R.; Kaspar, H.; Lipow, M.; MacLeod, G.J.; Merritt, M.J. (1978), Characteristics of Software Quality, North-Holland.
  • Chidamber, S.; Kemerer, C. (1994), A Metrics Suite for Object Oriented Design. IEEE Transactions on Software Engineering, 20 (6), pp. 476–493
  • Ebert, Christof; Dumke, Reiner, Software Measurement: Establish - Extract - Evaluate - Execute, Kindle Edition, p. 91
  • Garmus, D.; Herron, D. (2001), Function Point Analysis, Addison Wesley
  • Halstead, M.E. (1977), Elements of Software Science, Elsevier North-Holland
  • Hamill, M.; Goseva-Popstojanova, K. (2009), Common faults in software fault and failure data. IEEE Transactions of Software Engineering, 35 (4), pp. 484–496
  • Jackson, D.J. (2009), A direct path to dependable software. Communications of the ACM, 52 (4).
  • Martin, R. (2001), Managing vulnerabilities in networked systems, IEEE Computer.
  • McCabe, T. (December 1976), A complexity measure. IEEE Transactions on Software Engineering
  • McConnell, Steve (1993), Code Complete (First ed.), Microsoft Press
  • Nygard, M.T. (2007), Release It! Design and Deploy Production Ready Software, The Pragmatic Programmers.
  • Park, R.E. (1992), Software Size Measurement: A Framework for Counting Source Statements. (CMU/SEI-92-TR-020)., Software Engineering Institute, Carnegie Mellon University
  • Pressman, Roger S. (2005). Software Engineering: A Practitioner's Approach (Sixth International ed.). McGraw-Hill Education. ISBN 0071267824.
  • Spinellis, D. (2006), Code Quality, Addison Wesley