The Software Certification Process


Introduction

Certification generally refers to the confirmation that a system or object possesses specific characteristics. In the case of software certification, the software is tested for the characteristics it claims to have, including the performance and security characteristics of its applications (Alhazmi, Malaiya & Ray 2006, p. 220). This report examines software certification and the vulnerability of software to attack by malicious programs or hackers, and it introduces models that are used to evaluate attacks and protect software against vulnerability and the risk of attack.


Maximum, or at least considerable, protection of the data and information stored in systems can be attained through single-point protection and inspection of outbound traffic for all web applications (Hatcliff et al. 2009, p. 13). Similarly, software should meet specific minimum requirements: it should protect against common attacks and help curb outbound data theft within networks (Hatcliff et al. 2009, p. 12). It is also important that the software offers website cloaking and protects all information stored within the systems (Hatcliff et al. 2009, p. 13). There should be secure HTTP traffic, which ensures that transferred information cannot easily be intercepted, together with firewall services for data being uploaded. In addition, the applications or security procedures should protect websites and web applications against application-layer attacks (Hatcliff et al. 2009, p. 13), and they should report on the attacks and on the attackers targeting the application. Comprehensive protection of a website requires comprehensive coding of the application, so that maximum protection of data is integrated from the start (Alhazmi, Malaiya & Ray 2006, p. 220).
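As a concrete illustration of the secure HTTP traffic and firewall-style filtering described above, the sketch below (in Python) shows a minimal WSGI middleware that redirects plain-HTTP requests to HTTPS and adds basic protective response headers. It is only a simplified, assumed example; the function names and header choices are illustrative and are not drawn from the sources cited.

# Minimal sketch: a WSGI middleware that enforces HTTPS and adds basic
# security headers. Names such as `app` and `secure_app` are illustrative.
def enforce_https(app):
    """Wrap a WSGI application so plain-HTTP requests are redirected to HTTPS."""
    def secure_app(environ, start_response):
        scheme = environ.get("wsgi.url_scheme", "http")
        if scheme != "https":
            host = environ.get("HTTP_HOST", "localhost")
            path = environ.get("PATH_INFO", "/")
            # Redirect the client to the HTTPS version of the same URL.
            start_response("301 Moved Permanently",
                           [("Location", f"https://{host}{path}")])
            return [b""]

        def add_headers(status, headers, exc_info=None):
            # Basic hardening headers; a real deployment would tune these.
            headers += [("Strict-Transport-Security", "max-age=31536000"),
                        ("X-Content-Type-Options", "nosniff")]
            return start_response(status, headers, exc_info)

        return app(environ, add_headers)
    return secure_app

Any existing WSGI application could be wrapped as secure_app = enforce_https(application) before being served.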

Main body

Depending on the nature of the software, there is a chance that it will be attacked by internet hackers and thus lose the integrity needed to maintain the privacy of information (Alhazmi, Malaiya & Ray 2006, p. 220). In the certification process, the personnel examine how the software handles such attacks in order to protect the information it stores or processes; this determines the quality of the software (Alhazmi, Malaiya & Ray 2006, p. 220).

Therefore, there should be a quality assurance management process for testing the security integrity of software. Quality assurance personnel act as intermediaries between developers and users of information technology. Quality assurance can be graded according to who performs the test: for example, a developer can test the code that is shipped, or technicians can work with automated assessment tools (Hatcliff et al. 2009, p. 12). The certification of software quality is carried out by specialized vendors who ensure that the software supplied to users meets the minimum requirements for keeping information secure (Hatcliff et al. 2009, p. 12). However, despite the improvements needed to secure software integrity, certifying companies and organizations are compromising the quality of the software. Laxity among the operators responsible for organizational security, and among the certifying personnel, has allowed vulnerable software applications to find their way to users.

In the case of multifaceted attacks, the applications are usually at the highest risk of attack through data fraud and hacking (Hatcliff et al. 2009, p. 12). Data hacking and data loss from compromised applications threaten to damage the reputation of a company and of the software's developers. For example, companies that undertake online operations such as online money transfer face the greatest challenges in keeping their transactions secure when they run on faulty operating systems (Hatcliff et al. 2009, p. 15). Such companies create firewalls, which are essential in curbing cyber crime. In the banking industry, there is a need to ensure that the systems used are secure, bearing in mind the sensitive activities involved in the process and the vulnerability inherent in such transactions (Hatcliff et al. 2009, p. 15). Deploying secure applications is the most important investment in such industries, and the applications should be maintained at all times to ensure that they are risk proof.

Software certification is of paramount importance. Besides determining the security quality of the software, it is important to equip users with the information they need when using the software. Training on the software provides its creators with a scorecard that helps them determine flaws in the applications. In addition, certification of programs and software helps the creators market the product more effectively during certification (Andriole 1986, p. 95), as a result of the curiosity created during the training. Studies indicate that software certification is different from any other type of certification and thus must be based on both the development process and the performance of the product. The certification process in this case certifies the methodology of processing rather than the artifact, which is the software itself (Andriole 1986, p. 95).

To come up with the best software, certification should start at the beginning of processing, with cooperation between the developer and the certifying agencies (Andriole 1986, p. 96). However, certification itself does not occur until the end of the process. During processing, it is up to the developer to provide the plan to the government agencies involved in the certification process. A discussion then follows on matters arising from software processing and development (Andriole 1986, p. 96). As processing commences, there should be continuous documentation of all observations throughout the production life cycle. The safety of the application is the principal item the agencies look for as they carry out the certification (Andriole 1986, p. 96). The inspecting authorities will be on the lookout for further security issues that might impede the certification of the software. When all queries have been answered, the government or the inspecting agencies will authorize certification, and the software will be licensed for use by organizations, businesses or individuals (Andriole 1986, p. 96).


Despite the fact that the certification process is meant to assure the user about the safety of the software, the process does not meet its intended purpose. Many manufacturers use certification to avoid the legal and moral obligations associated with manufacturing the software (Andriole 1986, p. 97). In addition, certification cannot determine whether the system under which the software was created was correct or compromised; compromised systems create faulty software, and faulty software is reported to have the highest rate of vulnerability.

Research indicates that the vulnerabilities of software and applications can be predicted. The studies that arrived at these findings used information obtained from vulnerability databases maintained by internet server vendors such as Microsoft and the Apache Foundation for their web servers. The process of software vulnerability assessment helps software developers form a clear picture of the reality in the market as they write code for software (Joh & Malaiya 2007, p. 430). Although completely flawless software is not realistic, developers can make use of tools that concentrate attention on the most affected code used in software processing (Joh & Malaiya 2007, p. 430).
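To make the idea of vulnerability prediction concrete, the following is a minimal sketch of fitting a logistic (S-shaped) vulnerability-discovery curve to cumulative vulnerability counts, broadly in the spirit of the models discussed by Alhazmi, Malaiya and Ray (2006). The data points, parameter names and the use of SciPy are illustrative assumptions, not figures from the studies cited.

# Minimal sketch: fitting a logistic vulnerability-discovery curve to
# cumulative vulnerability counts. The data points below are invented.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, total, rate, midpoint):
    """Cumulative vulnerabilities discovered by month t (S-shaped curve)."""
    return total / (1.0 + np.exp(-rate * (t - midpoint)))

months = np.array([1, 3, 6, 9, 12, 18, 24, 30, 36], dtype=float)
found = np.array([2, 5, 11, 19, 28, 41, 49, 53, 55], dtype=float)

params, _ = curve_fit(logistic, months, found, p0=[60.0, 0.2, 12.0])
total, rate, midpoint = params
print(f"Estimated eventual vulnerabilities: {total:.0f}")
print(f"Predicted cumulative count at month 48: {logistic(48, *params):.0f}")

A developer could use such a fitted curve to judge roughly how many vulnerabilities remain undiscovered and when to concentrate review effort.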

Research further indicates an increased tendency toward information breaches in companies rolling out new software. Such companies do not offer enough information and demonstrations to help users determine the vulnerabilities of the software (Sahinoglu 2008, p. 154). Even when new software comes with the required manuals and application directions, dealing with it can be very exhausting. Since the applications are new and cannot yet be upgraded because of incompatibility, users must rely on the knowledge and management strategies involved in handling information (Sahinoglu 2008, p. 154).

In addition, research helps us understand that it is possible for developers to design more competent software that fits the needs of the market (Talib, Khelifi & Jololian 2010, p. 90). The engineers working on the software should observe and eliminate all its vulnerabilities before fully dispatching it to users. Most engineers have an idea of the vulnerabilities of the products they introduce to the market (Talib, Khelifi & Jololian 2010, p. 90). Using source-code markers, engineers can track all risky applications and programs designed to affect the functioning of a program (Talib, Khelifi & Jololian 2010, p. 90). Although the programs involved in the analysis of software are continually reviewed, they are not fully accredited to give the best analysis results. This has made the programs less reliable to companies, resulting in breaches of contract for many companies that depend fully on the software. Historically, companies have found themselves facing the same complications, which challenges the relevance of the applications. The difficulty with the applications arises from employees committing perennial mistakes when handling the programs used in the analysis of the software (Talib, Khelifi & Jololian 2010, p. 91).

To come up with a comprehensive analysis of software, several metrics are used in the process, including internal and external metrics (Joh & Malaiya 2007, p. 433). Metrics are important tools for predicting system faults and performance failures, especially early in the software life cycle (Joh & Malaiya 2007, p. 434). Beyond the metrics, tools are needed to devise appropriate measures against infringement of application security and to establish how often these measures can be applied in software production. The tools developed are important to companies because they help them evade attacks by hackers and other malicious programs on the internet. Analyses of the reasons for increases in vulnerabilities assert that the process involves more defects than vulnerability cases (Kim, Malaiya & Ray 2007, p. 147). Besides creating more suitable tools to help fight the increased cases of vulnerabilities and flaws among software developers, these tools are used in research. There are different ways of hacking into software and bringing it down, including attacks targeting the most effective firewalls. Research has helped developers produce more effective software that withstands such attacks and thus reduces the number of risky applications (Kim, Malaiya & Ray 2007, p. 147). In addition to increased research in data security, the government imposes regulations to help companies overcome some of the activities that put a company's data security in jeopardy.
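As a simple illustration of such metrics, the sketch below computes defect density and vulnerability density per thousand source lines of code (KSLOC) for a few modules. The module names and counts are hypothetical.

# Minimal sketch: defect density and vulnerability density per KSLOC.
# Module names and counts are hypothetical.
modules = {
    # name: (source lines of code, known defects, known vulnerabilities)
    "auth": (12_000, 34, 5),
    "payment": (20_500, 51, 9),
    "report": (8_200, 12, 1),
}

for name, (sloc, defects, vulns) in modules.items():
    ksloc = sloc / 1000.0
    print(f"{name:8s} defect density: {defects / ksloc:5.2f}/KSLOC, "
          f"vulnerability density: {vulns / ksloc:4.2f}/KSLOC")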

Vulnerability analysis holds that the programs shown to have the highest rate of vulnerability are the faultiest. This belief has, however, been refuted: the dependability of software depends on how it is coded during processing. Some indicators of faulty software include faulty behavior at execution time and failure to execute as a result of faults in the coding. An interesting aspect of the coding is that if the system's faulty code is never executed, it poses no problem for the system's vulnerability analysis (Joh & Malaiya 2007, p. 434). These findings further describe a vulnerability as a fault, which may result from omission or inclusion in the coding of the system. Software vulnerability is therefore caused by faults in code that are exercised during the execution of the software; these faults lead to security failures that leave the software vulnerable to various forms of attack (Joh & Malaiya 2007, p. 434). Despite the vulnerability checks in the management of software processing, some faults are still left behind.

Research indicates that some of the faults result from the lack of set standards for certifying institutions to follow, which in turn compromises the security of the information handled by the systems. In the United States, certifying organizations have created standard checks used to elucidate software security standards (Joh & Malaiya 2007, p. 434). A software security analysis looks at the planning process during the creation of the software, the development process, the verification process and the certification liaison process (Andriole 1986, p. 98). Software verification has specific components that can be followed during analysis. One is requirements analysis, which helps determine the software requirements and the design process (Andriole 1986, p. 98). Another is the software architecture, which helps detect and report errors that might arise while the architecture is developed (Andriole 1986, p. 99). The source code is a further component of the verification process; it helps detect and report errors made during coding (Andriole 1986, p. 99). The output of the integration process ensures that the results obtained from integration are correct and complete. Last but not least are the test cases and the results obtained from their procedures; they tell whether testing was performed correctly and precisely (Andriole 1986, p. 99). Some of the faults can be detected in the initial stages of the software life cycle, as the checklist sketch below illustrates.
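The following is a hedged sketch of how the verification components listed above might be tracked during a certification review, using a simple checklist structure. The component names follow the paragraph; the statuses, error counts and the readiness rule are illustrative assumptions.

# Minimal sketch: tracking verification components during a certification review.
# Statuses and error counts are illustrative.
from dataclasses import dataclass

@dataclass
class VerificationItem:
    component: str
    reviewed: bool
    open_errors: int

checklist = [
    VerificationItem("requirements analysis", True, 0),
    VerificationItem("software architecture", True, 2),
    VerificationItem("source code review", False, 0),
    VerificationItem("integration output", True, 1),
    VerificationItem("test cases and results", False, 0),
]

def certification_ready(items):
    """Certification can proceed only if every component is reviewed and error-free."""
    return all(item.reviewed and item.open_errors == 0 for item in items)

for item in checklist:
    print(f"{item.component:25s} reviewed={item.reviewed} errors={item.open_errors}")
print("Ready for certification:", certification_ready(checklist))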


The capability to predict failures in the initial stages of the software life cycle provides a safety net for companies with faulty applications (Joh & Malaiya 2007, p. 435). Such prediction reduces the unnecessary losses that accrue from the non-performance of the software. Predicting problems in software can be used to determine the extent of damage that system failures may cause and how the flaws can be avoided (Joh & Malaiya 2007, p. 435). Prediction is particularly important for risk management and risk analysis, since it points to the most appropriate measures to be taken and the most appropriate strategies for keeping data and sensitive information secure. In software engineering, the prediction of faults and failures is important for prioritizing risk, planning inspections and redrafting more competent software (Andriole 1986, p. 100).
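The sketch below illustrates the kind of risk prioritization the paragraph describes, ranking hypothetical components by a simple likelihood-times-impact score so that inspection effort goes to the riskiest parts first. All names and figures are invented.

# Minimal sketch: ranking components for inspection by a simple
# likelihood-times-impact risk score. All figures are invented.
components = [
    # (name, predicted fault likelihood 0-1, impact of failure 1-10)
    ("session handling", 0.40, 9),
    ("report exporter", 0.15, 3),
    ("payment gateway", 0.25, 10),
    ("theme engine", 0.60, 2),
]

ranked = sorted(components, key=lambda c: c[1] * c[2], reverse=True)

print("Inspection priority (highest risk first):")
for name, likelihood, impact in ranked:
    print(f"  {name:16s} risk score = {likelihood * impact:4.1f}")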

There is a difference between the definitions of failures and faults in the field of software coding. The classical explanations of failures and faults do not touch on metrics or on static analysis of the software. Metric analysis of the software code can increase the exposure of faults. The analysis of an application's actual mistakes should target errors of inclusion in the code, errors of omission, or a combination of the two (Joh & Malaiya 2007, p. 430). Reporting on vulnerabilities depends on the method used to determine how the software behaves, either during an actual attack or during the testing process (Joh & Malaiya 2007, p. 436).

On the same note, vulnerability and attack-proneness are related terms in their application and response, as shown by the analyzing authorities and personnel. Faulty software shows the highest chance of vulnerability, since it gives hackers time to act on flaws (Joh & Malaiya 2007, p. 436). The more a program is compromised, the greater the chance that it can be exploited by attackers. In addition, within the systems there are fields that are secure enough that attackers are unable to mount an attack. If the program has a lower rate of attack-proneness, the specialists will spend less time on the analysis of the software (Joh & Malaiya 2007, p. 436). Software may also contain flaws of which the manufacturer is not aware.

Analysts and manufacturers always hope that the percentage of security problems is small compared with how effective the application is. However, there is a chance that the application has a greater percentage of failures than anticipated (Joh & Malaiya 2007, p. 436). Research indicates that the number of failures in software may not always be attributable to failures of the application itself. Other factors may cause a breach of the application's integrity. Such failures are attributed not to the failure-prone or vulnerability-prone aspects of an attack but rather to software drivers and support programs (Joh & Malaiya 2007, p. 436). This has led security failures to be grouped into categories.

Security failures can be grouped into four categories, established after analysis of other risks considered very serious and disastrous to the effective running of software, and depending on the applicability of the software to the functions it was intended to perform. The categories are remote attacks, insider attacks, audit capability and security documentation (Schryen 2009, p. 10).

In the case of remote attacks, the failures involved result from remote use of an application or a machine. Pre-release and post-release failure reports describing similar problems act as the sources of information on such attacks. These sources indicate the concerns for this group and the kinds of attack involved; where the probabilities are higher, the vulnerabilities are easily exploited.

In the case of insider attacks, the failure reports indicate that most of the faults resulted from a local user of the machine, since most of the failures arose in the use of the software application. Attacks that do not require remote access to the machine comprise the insider attacks, which can result from reckless use of systems and applications by an organization's employees (Schryen 2009, p. 10).

Audit capability concerns absent or weak logging of important information as the software starts (Schryen 2009, p. 11). If there is a logging problem in a bank's financial systems, attackers may gain illegal access to customers' accounts and may authorize transactions. Hence, software should keep logs as one of the measures for ensuring that it is secure to use (Schryen 2009, p. 11).
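As an illustration of the audit-capability concern, the following minimal sketch configures an audit log at application start-up using Python's standard logging module. The log file name and the example events are illustrative.

# Minimal sketch: configuring an audit log at application start-up.
# The log file name and event details are illustrative.
import logging

def configure_audit_log(path="audit.log"):
    """Set up a dedicated audit logger that records start-up and key events."""
    logger = logging.getLogger("audit")
    logger.setLevel(logging.INFO)
    handler = logging.FileHandler(path)
    handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
    logger.addHandler(handler)
    return logger

audit = configure_audit_log()
audit.info("application started")
audit.info("user 'teller01' authorized transaction 4711")  # example event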


Security documentation is another category of risky practices or components that may leave software vulnerable to attacks (Schryen 2009, p. 11). However, this category focuses mainly on trust and competence on the side of the users (Schryen 2009, p. 11). If users are reluctant to place trust appropriately, a security breach can follow. Before one starts using an application, it is important to exercise the critical and sensitive transactions on which software analysis focuses. The analysis can be done effectively via models.

Software analysis models are designed by analysts to provide an effective, targeted analysis. The specific aspect under evaluation requires evaluators to develop metrics for analyzing the security position of the application. To come up with appropriate models for analyzing the vulnerability of software, the researchers used both internal and external metrics: automated static analysis (ASA) produced the external metrics, and SLOC provided the internal metrics. The metrics were intended to determine the density and count of memory leaks, null pointers and buffer-overflow alerts (Schryen 2009, p. 12), the sum of the densities and counts of three previously encountered attacks on the software, and, lastly, the count and density of all alerts reported by ASA.

As stated, SLOC is an internal metric that involves the count of source lines of code together with code churn. For the external metrics, the parameters were the densities of pre-release failures, post-release failures and the total of pre- and post-release non-functioning of the applications (Alhazmi, Malaiya & Ray 2006, p. 220).
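To illustrate how such internal and external measures could be combined for a single release, the sketch below tabulates hypothetical SLOC, churn and static-analysis alert counts and derives their densities. Every figure is invented.

# Minimal sketch: combining internal metrics (SLOC, churn) with
# static-analysis alert counts into per-KSLOC densities. All figures are invented.
release = {
    "sloc": 48_000,
    "churned_lines": 6_400,
    "alerts": {"memory leak": 14, "null pointer": 22, "buffer overflow": 5},
}

ksloc = release["sloc"] / 1000.0
print(f"Churn ratio: {release['churned_lines'] / release['sloc']:.2%}")

total_alerts = 0
for kind, count in release["alerts"].items():
    total_alerts += count
    print(f"{kind:16s} density: {count / ksloc:5.2f} alerts/KSLOC")

print(f"All alerts together: {total_alerts / ksloc:.2f} alerts/KSLOC")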

A number of researchers are of the opinion that it is important to integrate security components throughout the software life cycle. In some cases, developers and learning institutions monitor the security of the software in only one phase (Talib, Khelifi & Jololian 2010, p. 95). Most of the engineers trained in such organizations are concerned only with coding the software and disregard security in the other steps of production (Talib, Khelifi & Jololian 2010, p. 95).

The failure to consider security at all levels of production has limited the amount of software that engineers have been able to produce. It also affects corporations, which are then unable to produce software free from vulnerabilities. The inability to produce secure applications is attributed to a lack of adequate knowledge among the engineers working on the software (Talib, Khelifi & Jololian 2010, p. 95). To attain these goals, the IEEE Computer Society has developed guidelines for the activities and practices that should be used during software processing (Talib, Khelifi & Jololian 2010, p. 95). Research indicates the need to ensure that developers are supplied with the information required to keep the software they develop fully managed and analyzed. These activities help determine any chance of vulnerability in the running of the software (Talib, Khelifi & Jololian 2010, p. 96).

The applicability of software depends on the hardware on which it is used. Before recommending software to a user, the analyst should determine the software/hardware integration (Andriole 1986, p. 102). This requires determining the computing environment in which the program works best and deducing whether it is driver-sensitive. Besides the hardware, it is also important to determine how the software integrates with other components of a computer and to confirm that it does not affect the normal running of other, unrelated software (Andriole 1986, p. 102). Another important factor in determining software security is the minimum virtual memory needed to run the application on a computer. Like any other process, the analysis of software security has its challenges.
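A minimal sketch of the kind of environment check described above follows; it assumes the third-party psutil package is available, and the 512 MB threshold is an illustrative figure rather than a requirement taken from the sources.

# Minimal sketch: checking that the host meets an assumed minimum
# memory requirement before recommending the software.
# Assumes the third-party `psutil` package; the 512 MB figure is illustrative.
import sys
import psutil

MIN_MEMORY_BYTES = 512 * 1024 * 1024  # assumed minimum requirement

def host_meets_requirements():
    total = psutil.virtual_memory().total
    print(f"Total memory: {total / (1024 ** 2):.0f} MB "
          f"(required: {MIN_MEMORY_BYTES / (1024 ** 2):.0f} MB)")
    return total >= MIN_MEMORY_BYTES

if not host_meets_requirements():
    sys.exit("This machine does not meet the minimum memory requirement.")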

Research indicates that the main challenges facing software production include a lack of clarity about expectations and about how to communicate with software-processing regulators (Hatcliff et al. 2009, p. 16). Another is that developers do not always have a clear idea of what they want to produce and usually seek consultants' advice. The lack of a clear definition of evidence, and of how it should be evaluated to arrive at the best specifications, is a further challenge facing the process. In addition, the requirements and the environment in which the software will be applied are often poorly documented or not documented at all. Developers do not have enough information to reach appropriately fitting conclusions (Hatcliff et al. 2009, p. 16). This lack of information makes it difficult to determine metrics and attributes for the software application.

In order to attain an effective mechanism for controlling the security of developed software, developers should use formal methods and effective development techniques (Hatcliff et al. 2009, p. 19). Developers should also build dependability cases that include all security concerns, not only those raised to determine the applicability of the software (Hatcliff et al. 2009, p. 16). They should also practice transparency and accountability so that they can supply their customers with appropriate software that meets their needs and specifications.

Conclusion

In conclusion, the levels of vulnerability of software should be evaluated and the findings stored for future use. Software developers should be involved in software research so as to ensure that they have the specifics of software attacks and flaws. In addition, the analysis models should be considered when setting up a software analysis project. Developers should be very cautious right from the initial stages of generating software, and they should maintain contact with the government agencies involved in protecting intellectual property. I recommend that software developers create interactive websites where their users can continually post reports on software flaws and bugs. In addition, developers should continuously review the code in their software.

References

Alhazmi, OH, Malaiya, YK & Ray, I 2006, 'Measuring, analyzing and predicting vulnerabilities in software systems', Computers & Security, vol. 26, no. 3, pp. 219-228.

Andriole, SJ 1986, Software validation, verification, testing and documentation, Petrocelli Books, Princeton, NJ.

Hatcliff, J, Heimdahl, M, Lawford, M, Maibaum, T, Wassyng, A & Wurden, F 2009, 'A software certification consortium and its top 9 hurdles', Electronic Notes in Theoretical Computer Science, vol. 238, pp. 11-17.

Joh, H & Malaiya, YK 2007, 'A framework for software security risk evaluation using the vulnerability life-cycle and CVSS metrics', IEEE Computer Society, pp. 430-434.

Kim, J, Malaiya, YK & Ray, I 2007, 'Vulnerability discovery in multi-version software systems', IEEE Computer Society, vol. 7, pp. 141-148.

Sahinoglu, M 2008, ‘An input–output measurable design for the security meter model to quantify and manage software security risk’, IEEE Transactions on Instrumentation and Measurement, vol. 57, no. 6, pp. 1251- 1260.

Schryen, G 2009, ‘Security of open source and closed source software: An empirical comparison of published vulnerabilities’, Americas Conference on Information Systems, pp. 1-12.

Talib, MA, Khelifi, A & Jololian, L 2010, 'Secure software engineering: A new teaching perspective based on the SWEBOK', Interdisciplinary Journal of Information, Knowledge, and Management, vol. 5, pp. 82-99.
