
Development of Evaluation System for Defense Informatization Level

Seungbae Sim1, Sangho Lee2,*
1Center for Military Analysis and Planning, Korea Institute for Defense Analyses, Seoul, Republic of Korea, sbsim@kida.re.kr
2Department of IT Management, Sun Moon University, Asan-si, Republic of Korea, slee@sunmoon.ac.kr
*Corresponding Author : Sangho Lee, 70 Sunmoon-ro 221beon-gil, Asan-si, Chungcheongnam-do, 31460, Republic of Korea, +82-41-530-2515, slee@sunmoon.ac.kr

© Copyright 2019 Korea Multimedia Society. This is an Open-Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/4.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

Received: Nov 03, 2019; Revised: Nov 22, 2019; Accepted: Nov 25, 2019

Published Online: Dec 31, 2019

Abstract

There is a saying that you cannot manage what you do not measure. The Korea Ministry of National Defense (MND) conducts evaluations in various fields to obtain meaningful effects from IT investments, and divides the evaluation of the defense informatization sector into defense informatization policy evaluation and defense informatization project evaluation. The defense informatization level evaluation can measure the informatization level of the MND and the armed forces or organizations. Since the evaluation system studied so far for measuring the level of defense informatization is composed mainly of qualitative metrics, it needs to be reconstructed around quantitative metrics that can guarantee objectivity. In addition, to manage changes in the level of each evaluation object, the evaluation system should be designed with a focus on the homeostasis of metrics so that it can be measured periodically. Moreover, metrics should be evaluated in terms of performance against targets. To this end, this study proposes to measure the level of defense informatization in terms of the defense information network, computer systems, interoperability and standardization, information security, the informatization environment, and information system use, and suggests metrics for each area.

Keywords: Defense Informatization Level; Evaluation Metric; Informatization Performance; IT Effect

I. INTRODUCTION

“You can't manage what you don't measure.” This saying, variously attributed to W. Edwards Deming or Peter Drucker [1], captures the importance of measurement. The Korea Ministry of National Defense (MND) conducts evaluations in various fields to obtain meaningful effects from IT investment. The evaluation in the defense informatization sector is divided into defense informatization policy evaluation and defense informatization project evaluation [2], [3]. The defense informatization policy evaluation assesses the various defense informatization policies of the MND and the military forces, and the defense informatization project evaluation assesses IT projects, such as information system (IS) development projects, IT procurement projects, and IS operation projects, carried out by the MND, Army, Navy, and Air Force [4]. The defense informatization level evaluation, on the other hand, measures the informatization level, such as the informatization capacity, of the MND and the related organizations or agencies.

Defense informatization aims to accomplish military missions and create results by utilizing information technology. If the level of defense informatization associated with input or process factors improves, the outcome factor, that is, performance or net effect, can improve as well. The purpose of measuring the informatization level is thus to improve the input and process factors, as distinct from managing informatization performance (the output factors).

This study proposes an evaluation system that measures the level of defense informatization in terms of the defense information network, computer systems, interoperability and standardization, information security, the informatization environment, and IS use, and describes quantitative evaluation metrics for each field.

II. RELATED WORKS

2.1. Information Systems Success Model

DeLone and McLean [5] suggested an information systems (IS) success model (Fig. 1) after reviewing existing research on IS performance. The model posits that IS quality and information quality both affect IS use and user satisfaction, that IS use and user satisfaction affect the individual performance of IS, and that individual performance in turn affects organizational performance. In addition, IS use and user satisfaction affect each other. Viewed through the IS success model, IS performance depends on IS quality and information quality, which are input factors, and on IS use and user satisfaction, which are process factors.

Fig. 1. DeLone and McLean’s IS success model. Source: DeLone and McLean [5, p. 87], Fig. 2.

DeLone and McLean [6] reviewed the many empirical studies that applied their model [5] and suggested an updated IS success model (Fig. 2). They added service quality and intention to use to the original model [5], along with a feedback loop from net benefits back to intention to use and user satisfaction. In the view of the updated IS success model, the net benefits (performance) depend on IS quality, information quality, service quality, intention to use, use, and user satisfaction. This model treats the quality of systems, information, and service, together with IS use and user satisfaction, as the informatization level.

Fig. 2. DeLone and McLean’s updated IS success model. Source: DeLone and McLean [6, p. 24], Fig. 3.
2.2. ITU ICT Development Index

The International Telecommunication Union (ITU) measures the level of informatization of each country by calculating the Information and Communication Technology (ICT) Development Index (IDI) [7], [8], [9]. The IDI comprises ICT access, ICT use, and ICT skills. ICT access measures fixed-telephone subscriptions per 100 inhabitants, mobile-cellular telephone subscriptions per 100 inhabitants, international Internet bandwidth (bit/s) per Internet user, the percentage of households with a computer, and the percentage of households with Internet access.

ICT use applies metrics such as the percentage of individuals using the Internet, fixed-broadband subscriptions per 100 inhabitants, and active mobile-broadband subscriptions per 100 inhabitants. ICT skills uses the adult literacy rate, the secondary gross enrolment ratio, and the tertiary gross enrolment ratio.

The weights for ICT access, ICT use, and ICT skills are 40%, 40%, and 20%, respectively, and equal weights are assigned to the metrics within each sub-index. In 2017, ITU member countries decided to revise and expand the IDI [10], but the results had not yet been published as of October 2019. In short, the IDI evaluates infrastructure, use, and skills (user capability).
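
As a concrete illustration of this weighting scheme, the short sketch below computes an IDI-style composite score in Python. Only the 40/40/20 sub-index weights and the equal weighting of metrics within a sub-index come from the IDI methodology described above; all indicator values are hypothetical.

```python
# Hypothetical normalized indicator scores on the IDI's 0-10 scale.
# Equal weights within a sub-index: ICT access is the mean of its five metrics.
access_indicators = [6.0, 8.0, 7.0, 7.5, 7.5]  # hypothetical values
ict_access = sum(access_indicators) / len(access_indicators)  # 7.2

ict_use, ict_skills = 6.5, 8.1  # hypothetical sub-index scores

# Sub-index weights from the IDI: 40% access, 40% use, 20% skills.
idi = 0.40 * ict_access + 0.40 * ict_use + 0.20 * ict_skills
print(f"IDI composite: {idi:.2f}")  # -> 7.10
```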

2.3. Evaluation Method of Korean Government

The Korean government's evaluation of informatization capability (level) is implemented as part of the evaluation of administrative management capability, with the aim of improving systems in the informatization field [11]. The evaluation is managed by the Ministry of the Interior and Safety under the Framework Act on Public Service Evaluation [12]. The organizations to be evaluated are 44 central government departments, including the MND.

The Ministry of the Interior and Safety conducts a preliminary evaluation of the informatization field with the National Information Society Agency (NIA) and notifies each central government department of the preliminary results. Each department then finalizes its own informatization-level evaluation based on these results.

The 2019 evaluation system for the informatization field allocates a total of 35 points, measuring the implementation of effective e-government (20 points) and the reinforcement of the cyber safety level (15 points). The evaluation metrics and their weights are shown in Table 1. The evaluation system also specifies the evaluation formula and evaluation method for each metric [11].

Table 1. Metrics for national informatization level evaluation in Korea.
Type Evaluation metric Weight
Implementation of effective e-government (20 points)
 Propulsion of performance management of e-government 14
  Measurement of information resource management level 3
  Implementation level of phased performance management 7
  Prevention of redundant investment 4
 Efficiency of website operation management 6
  Web compatibility/access level 3
  Website plug-in removal ratio 3
Reinforcement of cyber safety level (15 points)
 Privacy protection level 4
 E-government civil service information security level 4
 Cyber security management level 6
  Administrative information security level 3
  Cyber crisis management and technical information security level 3
 Cyber-attack response training result 1

A survey on the informatization level of small and medium enterprises (SMEs) is conducted by the Ministry of SMEs and Startups and the Korea Technology & Information Promotion Agency for SMEs. The survey supports the establishment of informatization strategies and of policy directions for supporting SMEs [13]. It examines the general status of companies, the willingness and plans for informatization reinforcement, the environment for informatization reinforcement, the current status of IS implementation and use, the level of informatization effectiveness, and smart factories and new ICT technologies. Since 2012, the evaluation of the SME informatization level has been carried out using not all of the surveyed items but only three: the willingness and plans for informatization reinforcement, the environment for informatization reinforcement, and the current status of IS implementation and use.

2.4. Research on Defense Informatization Level Evaluation

Many studies have been conducted on evaluation systems for the defense informatization level and on the metrics for each item. Lim et al. [14] suggest measuring the level of defense informatization in terms of informatization infrastructure, informatization environment, informatization use, and informatization performance. The informatization infrastructure covers the defense information network, computer systems, and interoperability and standardization. The informatization environment covers information security and the organization, education, investment, and strategy of informatization. Informatization use measures the use of battlefield management information systems (MISs) and resource MISs. Informatization performance measures the level of informatization combat power and the improvement of defense management efficiency. The study [14] also presents various evaluation metrics, with the weights of evaluation items and metrics calculated using the Analytic Hierarchy Process (AHP). A method of calculating the evaluation metrics may have been developed, but the study [14] does not provide it. To demonstrate the usability of the developed evaluation system, the informatization level was measured for military units at the operational command level or higher using status surveys and questionnaires.

For other methods of measuring the informatization level, such as the corporate informatization level evaluation and the survey on the SME informatization level prior to 2012, refer to the existing study [14].

Since the existing evaluation systems for the defense informatization level are composed mainly of qualitative evaluation metrics, they need to be reconstructed around quantitative evaluation metrics that can guarantee objectivity [15]. The evaluation system should be able to manage increases and decreases in the level of each evaluation area, and it should be designed with a focus on the homeostasis of metrics so that it can be measured periodically. In addition, it is necessary to refrain from evaluating every metric on a 5-point scale without a clear basis for comparison, and instead to evaluate achieved performance against targets. Measuring the informatization level with a questionnaire is simple, but it has limitations: the result may reflect the respondent's subjective perception rather than an objective state, and the measured level may change when the respondent changes even though the organization's actual level has not. Longitudinal analysis also needs to be available for tracking level improvements and trends.

The level evaluation can be performed as an improvement of metrics over the previous year, a comparison with domestic or overseas organizations (institutes), or an assessment against an absolute maturity level. In the short term, it is advisable to conduct the level assessment as a comparison of metric improvement over the previous year or a comparison of metrics with other organizations. Using the evaluation system for an informatization maturity level requires at least several years of accumulated data.

III. EVALUATION FOR DEFENSE INFORMATIZATION LEVEL

This study describes an evaluation system for measuring the level of defense informatization. The defense informatization level assessment is classified into IT infrastructure, informatization environment, and IS use according to the domain of evaluation (Table 2). In particular, the level assessment for IT infrastructure is subdivided into Defense information network (A), Computer systems (B), Interoperability and standardization (C), and Information security (D) [15]. In the Defense information network (A), speed, traffic, and availability/recovery time are measured. Server utilization, server availability, server throughput, and work efficiency are measured in Computer systems (B). Interoperability among ISs, standardization of reference information (master data), and interoperability between weapon systems and IS are measured in Interoperability and standardization (C). The detection, response, and recovery levels (capabilities) for infringement incidents are measured in Information security (D).

Table 2. Evaluation metrics for defense informatization level.
Item Metric
IT infrastructure – Defense information network(A)
 Speed(A.1) <A-1-1> Average speed among main nodes
<A-1-2> Average speed between main node and secondary node (branch line)
<A-1-3> Average speed between secondary node and user
 Traffic(A.2) <A-2-1> Average traffic among main nodes
<A-2-2> Average traffic between main node and secondary node (branch line)
<A-2-3> Average traffic between secondary node and user
 Availability/recovery time (A.3) <A-3-1> Network availability
<A-3-2> Recovery time
IT infrastructure – Computer systems(B)
 Server utilization(B.1) <B-1-1> Average (maximum) CPU use ratio
<B-1-2> Average storage use ratio
 Server availability (B.2) <B-2-1> Server availability
<B-2-2> Server recovery time
 Server throughput (B.3) <B-3-1> Average tpmC (transactions per minute, TPC-C)
<B-3-2> Average response time
 Work efficiency (B.4) <B-4-1> Input automation level (Input interface level)
IT infrastructure – Interoperability and standardization(C)
 Interoperability among ISs(C.1) <C-1-1> Average interoperability level among battlefield management information systems (MISs)
<C-1-2> Average data interconnection ratio among battlefield MISs
<C-1-3> Average interoperability level between battlefield MIS and resource MIS
<C-1-4> Average data interconnection level between battlefield MIS and resource MIS
<C-1-5> Average interoperability level among resource MISs
<C-1-6> Average data interconnection level among resource MISs
 Standardization of reference information (master data) (C.2) <C-2-1> Standardization ratio of reference information/code data (master data)
 Interoperability between weapon systems and IS (C.3) <C-3-1> Average interoperability level between weapon systems and battlefield MIS
<C-3-2> Average data interconnection ratio between weapon systems and battlefield MIS
IT infrastructure – Information security(D)
 Detection level (capability) of infringement incident(D.1) <D-1-1> Intrusion detection ratio in advance
 Response level (capability) of infringement incident(D.2) <D-2-1> Number of intrusion incidents per year
<D-2-2> Intrusion incident response time
 Recovery of infringement incident(D.3) <D-3-1> Trace ratio of infringement incidents
<D-3-2> Recovery time after infringement incident
Informatization environment(E)
 Efforts to improve informatization capability(E.1) <E-1-1> Annual time of informatization education
 Informatization master plan(E.2) <E-2-1> Informatization master plan
 Efficient execution of budget(E.3) <E-3-1> Efficiency of budget execution
IS use(F)
 Business informatization (F.1) <F-1-1> Ratio of tasks implemented by IS
 Business use(F.2) <F-2-1> Business use
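
To make the structure of Table 2 concrete, the sketch below models a small metric catalog in Python. The two scoring directions mirror the formulas used throughout Tables 3-6 and the Appendix: a satisfaction ratio (higher X is better) and an excess ratio (lower X is better). The metric identifiers follow Table 2, while the target values are purely illustrative assumptions, since target setting is left to the evaluator.

```python
from dataclasses import dataclass
from enum import Enum

class Direction(Enum):
    SATISFACTION = "satisfaction"  # higher X is better: Y = X / target * 100
    EXCESS = "excess"              # lower X is better: Y = (X - target) / target * 100

@dataclass
class Metric:
    metric_id: str   # identifier from Table 2, e.g. "A-1-1"
    item: str        # evaluation item the metric belongs to
    name: str
    direction: Direction
    target: float    # illustrative target value (not specified in the paper)

    def score(self, x: float) -> float:
        """Return the score Y for a measured value X."""
        if self.direction is Direction.SATISFACTION:
            return x / self.target * 100
        return (x - self.target) / self.target * 100

# Two entries mirroring Table 2; the targets are hypothetical.
registry = [
    Metric("A-1-1", "Speed (A.1)", "Average speed among main nodes",
           Direction.SATISFACTION, target=0.9),
    Metric("A-2-1", "Traffic (A.2)", "Average traffic among main nodes",
           Direction.EXCESS, target=0.7),
]
for m in registry:
    print(m.metric_id, round(m.score(0.81), 1))  # -> A-1-1 90.0, A-2-1 15.7
```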

The level evaluation for the defense information network uses metrics for Speed (A.1), Traffic (A.2), and Availability/recovery time (A.3). Table 3 shows the evaluation metrics for Speed (A.1): the average speed among main nodes (main lines), between a main node and a secondary node (branch line), and between a secondary node and a user (user network). The measurement X is the ratio of the measured network speed to the maximum (designed) speed, and the score Y is the ratio of X to the target value. The data for network speed can be acquired from a network management system (NMS).

Table 3. Metrics for Speed (A.1).
Item Description
Evaluation item Defense information network(A) >> Speed(A.1)
Metric <A-1-1> Average speed among main nodes
<A-1-2> Average speed between main node and secondary node (branch line)
<A-1-3> Average speed between secondary node and user
Explanation Average speed among main nodes, between a main node and a secondary node, and between a secondary node and a user in the defense information network
Measurement method X = (measured speed of network) / (maximum speed (designed speed))
Y = X / (target value) × 100 (satisfaction ratio of network speed)
Data gathering method (Data sources) ■ System □ Data
□ Questionnaires □ Interview
※ Measure with network management systems (NMS)
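
A minimal sketch of the Speed (A.1) computation follows, assuming hypothetical NMS throughput readings for the three tiers; the designed speeds and the target ratio are illustrative, not values from this study.

```python
# Hypothetical NMS readings as (measured, designed) speeds in Mbps for the
# trunk (A-1-1), branch (A-1-2), and user-network (A-1-3) tiers.
readings = {
    "A-1-1": (8_100, 10_000),
    "A-1-2": (850, 1_000),
    "A-1-3": (78, 100),
}
TARGET = 0.9  # assumed target value for X (measured / designed speed)

for metric_id, (measured, designed) in readings.items():
    x = measured / designed   # X = measured speed / maximum (designed) speed
    y = x / TARGET * 100      # Y = X / target value * 100
    print(f"{metric_id}: X = {x:.2f}, Y = {y:.1f}")
```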

Table 4 describes the metrics for traffic. The evaluation metrics for Traffic (A.2), like those for Speed (A.1), measure traffic on the trunk, branch, and user networks. They measure whether traffic exceeds the designed or estimated capacity. The data for network traffic can be acquired from the NMS.

Table 4. Metrics for Traffic (A.2).
Item Description
Evaluation item Defense information network(A) >> Traffic(A.2)
Metric <A-2-1> Average traffic among main nodes
<A-2-2> Average traffic between main node and secondary node (branch line)
<A-2-3> Average traffic between secondary node and user
Explanation Average traffic among main nodes, between a main node and a secondary node, and between a secondary node and a user in the defense information network
Measurement method X = (measured up/down traffic in network) / (available traffic assigned to network)
Y = (X - target value) / (target value) × 100 (excess ratio of traffic)*
Data gathering method (Data sources) ■ System □ Data
□ Questionnaires □ Interview
※ Measure average, minimum, and maximum traffic of network with network management systems (NMS)

* Note. The scoring method for metrics in the form of excess ratios: 0% or less, 100 points; over 0% to 20%, 80 points; over 20% to 40%, 60 points; over 40% to 60%, 40 points; over 60% to 80%, 20 points; over 80%, 0 points.

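
The note above defines a step function from excess ratios to points. A direct translation, resolving each band boundary as 'or less' per the note, might look as follows.

```python
def excess_ratio_points(y_percent: float) -> int:
    """Map an excess ratio Y (%) to points per the note in Table 4."""
    for upper_bound, points in [(0, 100), (20, 80), (40, 60), (60, 40), (80, 20)]:
        if y_percent <= upper_bound:
            return points
    return 0  # over 80%

# Example: traffic exceeding its target by 15% scores 80 points.
print(excess_ratio_points(15.0))  # -> 80
```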

The evaluation of the Availability/recovery time (A.3) metrics calculates availability as the probability that the network can provide service without interruption, and manages data on the average network recovery time for this calculation. Table 5 presents the metric for network availability; the data can be acquired from the NMS. The mean time to failure (MTTF) is the average network service time between successive failures.

Table 5. <A-3-1> Network availability metric.
Item Description
Evaluation item Defense information network(A) >> Availability/recovery time (A.3)
Metric <A-3-1> Network availability
Explanation A probability that the defense information network can be serviced without interruption
Measurement method X = MTTF / (MTTF + MTTR)
Y = X / (target value) × 100 (satisfaction ratio of availability)
※ MTTF (Mean Time To Failure): average network service time between failures
※ MTTR (Mean Time To Repair): average time to recovery after a network service outage
Data gathering method (Data sources) ■ System □ Data
□ Questionnaires □ Interview
※ Measure with network management systems (NMS)

Table 6 shows the measurement method for the Recovery time metric. It calculates the MTTR (Mean Time To Repair) and checks whether service is restored within the target recovery time; a shorter MTTR is better.

Table 6. <A-3-2> Recovery time metric.
Item Description
Evaluation item Defense information network(A) >> Availability/recovery time (A.3)
Metric <A-3-2> Recovery time
Explanation Average time to recovery after a network service outage
Measurement method X = MTTR
Y = (X - target value) / (target value) × 100 (excess ratio of recovery time)
※ MTTR (Mean Time To Repair): average time to recovery after network service outage
Data gathering method (Data sources) ■ System □ Data
□ Questionnaires □ Interview
※ Measure with network management systems (NMS)
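
To show how <A-3-1> and <A-3-2> could be computed from raw records, the sketch below derives MTTF, MTTR, availability, and the recovery-time excess ratio from a hypothetical NMS outage log. The outage timestamps and both target values are assumptions for illustration.

```python
from datetime import datetime, timedelta

# Hypothetical (outage_start, outage_end) pairs exported from the NMS.
outages = [
    (datetime(2019, 3, 1, 9, 0), datetime(2019, 3, 1, 10, 30)),
    (datetime(2019, 7, 12, 22, 0), datetime(2019, 7, 13, 0, 0)),
]
period_start, period_end = datetime(2019, 1, 1), datetime(2020, 1, 1)

downtime = sum(((end - start) for start, end in outages), timedelta())
uptime = (period_end - period_start) - downtime
mttf = uptime / len(outages)         # average service time between failures
mttr = downtime / len(outages)       # average time to recovery
availability = mttf / (mttf + mttr)  # X for metric <A-3-1>

target_availability = 0.999          # assumed target value
y_avail = availability / target_availability * 100     # satisfaction ratio

target_mttr = timedelta(hours=2)     # assumed target value
y_recovery = (mttr - target_mttr) / target_mttr * 100  # excess ratio, <A-3-2>

print(f"availability = {availability:.5f}, "
      f"Y(A-3-1) = {y_avail:.1f}%, Y(A-3-2) = {y_recovery:.1f}%")
```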

The other metrics are shown in the Appendix.

IV. CONCLUSION

This study describes a defense informatization level evaluation system modified from the defense informatization evaluation methodology [4]. The proposed evaluation system measures the defense informatization level in terms of IT infrastructure, informatization environment, and IS use [15]. It relies on direct measurement of the level where possible rather than on surveys, and this measurement effort should be built into the evaluation. The evaluation methods already implemented by the government are accommodated as much as possible so that the evaluation can be carried out efficiently while reducing the burden on defense organizations.

As with most research and methodologies, the proposed evaluation system has limitations. Target values must still be set for each metric. One also needs to consider metrics such as power usage effectiveness (PUE), which are used for US government data centers [16, 17]. In addition, this study would ideally present cases or results from applying the proposed evaluation system, but cannot do so because of defense security concerns.

Rather than waiting for the development of a sufficiently reasonable and theoretically complete evaluation system, it is better to refine the evaluation system while actually measuring the level of informatization. It is more meaningful to develop and apply an evaluation system that users can intuitively understand and use. Accumulated experience from repeated use of the evaluation system yields lessons learned and reveals complementary needs, which can make the evaluation system more robust. Through such a virtuous cycle, an evaluation system for the defense informatization level, with measurement metrics actively accepted by the stakeholders of the evaluation, can be developed.

Acknowledgements

This manuscript is based on Research Report [15]. The authors wish to thank the editors and the anonymous reviewers for their careful reviews and constructive suggestions. Their suggestions helped strengthen the paper. All errors are the sole responsibility of the authors.

REFERENCES

[1].

A. McAfee and E. Brynjolfsson, “Big data: The management revolution,” Harvard Business Review, pp. 60-68, October 2012.

[2].

Korea Ministry of National Defense (MND), Act on Establishment of Infrastructure for Informatization of National Defense and Management of Informational Resources for National Defense (Abbreviation: Act on Defense Informatization), Act No. 12553, http://elaw.klri.re.kr/kor_service/lawView.do?hseq=32670&lang=ENG, May 9, 2014.

[3].

Korea Ministry of National Defense (MND), Defense Informatization Task Directive, MND Directive No. 2129, http://www.law.go.kr/행정규칙/국방정보화업무훈령, Feb. 5, 2018. (In Korean)

[4].

H. J. Kwon, J. S. Choi, S. T. Kim, H. J. Lee, and Y. P. Sung, “A study for improving an evaluation systems of defense informatization,” Korea Institute for Defense Analyses, Seoul, Republic of Korea, Research Report, Feb. 2012. (In Korean)

[5].

W. H. DeLone and E. R. McLean, “Information systems success: The quest for the dependent variable,” Information System Research, vol. 3, no. 1, pp. 60-95, 1992.

[6].

W. H. DeLone and E. R. McLean, “The DeLone and McLean model of information systems success: A ten-year update,” Journal of Management Information Systems, vol. 19, no. 4, pp. 9-30, Spring 2003.

[7].

International Telecommunication Union, “Measuring the Information Society Report-Executive summary,” ITU Publications, 2010-2015.

[8].

International Telecommunication Union, “Measuring the Information Society Report-Executive Summary,” ITU Publications, 2017.

[9].

E. J. Lee and B. M. Ahn, “Evaluation of Informatization Level of ITU (IDI, ICT Development Index),” Korea Institute of S&T Evaluation and Planning, Seoul, Republic of Korea, KISTEP Statistic Brief, 2013-26, 2013. (In Korean)

[10].

International Telecommunication Union, “Measuring the Information Society Report,” Volume 1, Statistical Reports, ITU Publications, 2018.

[11].

Korea Ministry of the Interior and Safety, “2019 Central Government Department Self-evaluation (Public Administration Capability Part) Plan (Draft),” May 17, 2019. (In Korean)

[12].

Office for Government Policy Coordination, Framework Act on Public Service Evaluation, Act No. 14839, July 26, 2017. (In Korean)

[13].

Ministry of SMEs and Startups, and Korea Technology & Information Promotion Agency for SMEs, “2018 Survey on the Information Level of Korean Small and Medium Enterprise,” Daejeon, Republic of Korea, TIPA Research Report 19-01, May 31, 2019. (In Korean); https://www.mss.go.kr/site/smba/foffice/ex/statDB/StReportContentDetailView.do?gb=1&reSeq=1712.

[14].

G. G. Lim, D. C. Lee, H. J. Kwon, and S. R. Cho, “A case of developing performance evaluation model for Korean defense informatization,” Information Systems Review, vol. 19, no. 3, pp. 23-45, 2017. (In Korean)

[15].

S. Lee, H. S. Jung, and S. J. Yoon, “An application with an evaluation methodology for defense informatization and validating the methodology,” Sun Moon University, Asan, Republic of Korea, Research Report, Nov. 2012. (In Korean)

[16].

U.S. Department of Energy, “Data center metering and resource guide,” February 2017; https://datacenters.lbl.gov/sites/all/files/DataCenterMeteringandResourceGuide_02072017.pdf.

Authors

Seungbae Sim


Seungbae Sim is a research fellow in the Center for Military Analysis and Planning at the Korea Institute for Defense Analyses, Republic of Korea. He received his B.S. and M.S. degrees in Industrial Systems Engineering and his Ph.D. in Information and Industrial Engineering from Yonsei University. His research interests include software policy, software productivity, analysis and audit of information systems, IT project management, and data science.

Sangho Lee


Sangho Lee is an associate professor in the Department of IT Management at Sun Moon University, Republic of Korea. He received his Ph.D. from the Korea Advanced Institute of Science and Technology (KAIST) and his master's and bachelor's degrees from Sungkyunkwan University. His research interests are the business value of IT, the evaluation of IT investment projects, software engineering, and the causality between IT investment and performance.

Appendices

APPENDIX
Table A1. <B-1-1> Average (maximum) CPU use ratio metric.
Item Description
Evaluation item Computer systems(B) >> Server utilization(B.1)
Metric <B-1-1> Average (maximum) CPU use ratio
Explanation Average CPU use ratio of IS server
Measurement method X = Daily / Monthly / Yearly average (maximum) CPU use ratio
Y = (X - target value) / (target value) × 100 (excess ratio of CPU use)
Data gathering method (Data sources) ■ System   □ Data
□ Questionnaires   □ Interview
※ Measure with server management systems (SMS)
Table A2. <B-1-2> Average storage use ratio metric.
Item Description
Evaluation item Computer systems(B) >> Server utilization(B.1)
Metric <B-1-2> Average storage use ratio
Explanation Average storage use ratio of IS server
Measurement method X = daily / monthly / yearly average storage use ratio
Y = (X - target value) / (target value) × 100 (excess ratio of storage use)
Data gathering method (Data sources) ■ System   □ Data
□ Questionnaires   □ Interview
※ Measure with server management systems (SMS)
Table A3. <B-2-1> Server availability metric.
Item Description
Evaluation item Computer systems(B) >> Server availability(B.2)
Metric <B-2-1> Server availability
Explanation A probability that the server can be serviced without interruption
Measurement method X = MTTF / (MTTF + MTTR)
Y = X / (target value) × 100 (satisfaction ratio of availability)
※ MTTF (Mean Time To Failure): average time of server service
※ MTTR (Mean Time To Repair): average time to recovery after server service outage
Data gathering method (Data sources) ■ System   □ Data
□ Questionnaires   □ Interview
※ Measure with server management systems (SMS)
Table A4. <B-2-2> Server recovery time metric.
Item Description
Evaluation item Computer systems(B) >> Server availability(B.2)
Metric <B-2-2> Server recovery time
Explanation Average time to recovery after server service outage
Measurement method X = MTTR
Y = (X - target value) / (target value) × 100 (excess ratio of recovery time)
※ MTTR (Mean Time To Repair): average time to recovery after server service outage
Data gathering method (Data sources) ■ System   □ Data
□ Questionnaires   □ Interview
※ Measure with server management systems (SMS)
Table A5. <B-3-1> Average tpmC metric.
Item Description
Evaluation item Computer systems(B) >> Server throughput(B.3)
Metric <B-3-1> Average tpmC (transactions per minute, TPC-C)
Explanation Average transaction processing speed of IS server
Measurement method X = average tpmC
Y = X / (target value) × 100 (satisfaction ratio of throughput speed)
Data gathering method (Data sources) ■ System   ■ Data
□ Questionnaires   □ Interview
※ Measure with server management systems (SMS) and server specification
Table A6. <B-3-2> Average response time metric.
Item Description
Evaluation item Computer systems(B) >> Server throughput(B.3)
Metric <B-3-2> Average response time
Explanation Average response time of IS server to client requirement
Measurement method X = Average response time of IS server
Y = (X - target value) / (target value) × 100 (excess ratio of response time)
Data gathering method (Data sources) ■ System   ■ Data
□ Questionnaires   □ Interview
※ Measure with server management systems (SMS) and server specification
Table A7. <B-4-1> Input automation level (Input interface level) metric.
Item Description
Evaluation item Computer systems(B) >> Work efficiency (B.4)
Metric <B-4-1> Input automation level (Input interface level)
Explanation Automation level of data input in IS
Measurement method X = (# of data inputs with barcode/RFID/QR code) / (total # of data inputs)
Y = X / (target value) × 100
Data gathering method (Data sources) ■ System   ■ Data
□ Questionnaires   □ Interview
Table A8. <C-1-1> Average interoperability level among battlefield MISs metric.
Item Description
Evaluation item Interoperability and standardization(C) >> Interoperability among ISs(C.1)
Metric <C-1-1> Average interoperability level among battlefield MISs
Explanation Average Levels of Information System Interoperability (LISI) among battlefield MISs
Measurement method X = Average LISI among battlefield MISs
Y = X / (target value) × 100 (satisfaction ratio of interoperability)
Data gathering method (Data sources) □ System   ■ Data
□ Questionnaires   □ Interview
※ Use the LISI result measured by Defense Interoperability Center
Table A9. <C-1-2> Average data interconnection ratio among battlefield MISs metric.
Item Description
Evaluation item Interoperability and standardization(C) >> Interoperability among ISs(C.1)
Metric <C-1-2> Average data interconnection ratio among battlefield MISs
Explanation Average data interconnection ratio of interface among battlefield MISs
Measurement method X = (# of realized interconnections among ISs) / (# of required interconnections among ISs)
Y = X / (target value) × 100 (satisfaction ratio of data interconnection)
Data gathering method (Data sources) ■ System   ■ Data
□ Questionnaires   □ Interview
※ Measure the number of the realized interconnection from ISs and the number of the required interconnection from Requirement of Capability (ROC) / Information Exchange Requirement (IER) / System/Subsystem Specification (SSS)
Table A10. <C-1-3> Average interoperability level between battlefield MIS and resource MIS metric.
Item Description
Evaluation item Interoperability and standardization(C) >> Interoperability among ISs(C.1)
Metric <C-1-3> Average interoperability level between battlefield MIS and resource MIS
Explanation Average LISI between battlefield MIS and resource MIS
Measurement method X = Average LISI between battlefield MIS and resource MIS
Y = X / (target value) × 100 (satisfaction ratio of interoperability)
Data gathering method (Data sources) □ System   ■ Data
□ Questionnaires   □ Interview
※ Use the LISI result measured by Defense Interoperability Center
Table A11. <C-1-4> Average data interconnection ratio between battlefield MIS and resource MIS metric.
Item Description
Evaluation item Interoperability and standardization(C) >> Interoperability among ISs(C.1)
Metric <C-1-4> Average data interconnection ratio between battlefield MIS and resource MIS
Explanation Average data interconnection ratio of interface between battlefield MIS and resource MIS
Measurement method X = (# of realized interconnections among ISs) / (# of required interconnections among ISs)
Y = X / (target value) × 100 (satisfaction ratio of data interconnection)
Data gathering method (Data sources) ■ System   ■ Data
□ Questionnaires   □ Interview
※ Measure the number of the realized interconnection from ISs and the number of the required interconnection from Requirement of Capability (ROC) / Information Exchange Requirement (IER) / System/Subsystem Specification (SSS)
Table A12. <C-1-5> Average interoperability level among resource MISs metric.
Item Description
Evaluation item Interoperability and standardization(C) >> Interoperability among ISs(C.1)
Metric <C-1-5> Average interoperability level among resource MISs
Explanation Average LISI (Levels of Information System Interoperability) among resource MISs
Measurement method X = Average LISI among resource MISs
Y = X / (target value) × 100 (satisfaction ratio of interoperability)
Data gathering method (Data sources) □ System   ■ Data
□ Questionnaires   □ Interview
※ Use the LISI result measured by Defense Interoperability Center
Table A13. <C-1-6> Average data interconnection ratio among resource MISs metric.
Item Description
Evaluation item Interoperability and standardization(C) >> Interoperability among ISs(C.1)
Metric <C-1-6> Average data interconnection ratio among resource MISs
Explanation Average data interconnection ratio of interface among resource MISs
Measurement method X = (# of realized interconnections among ISs) / (# of required interconnections among ISs)
Y = X / (target value) × 100 (satisfaction ratio of data interconnection)
Data gathering method (Data sources) ■ System   ■ Data
□ Questionnaires   □ Interview
※ Measure the number of the realized interconnection from ISs and the number of the required interconnection from Requirement of Capability (ROC) / Information Exchange Requirement (IER) / System/Subsystem Specification (SSS)
Table A14. <C-2-1> Standardization ratio of reference information/code data (master data) metric.
Item Description
Evaluation item Interoperability and standardization(C) >> Standardization of reference information (master data)(C.2)
Metric <C-2-1> Standardization ratio of reference information/code data (master data)
Explanation The ratio of standardized code to total code data/reference information
Measurement method X = (# of standardized codes in defense ISs) / (total # of code data in defense ISs)
Y = X / (target value) × 100 (satisfaction ratio of standardization)
Data gathering method (Data sources) ■ System   ■ Data
□ Questionnaires   □ Interview
Table A15. <C-3-1> Average interoperability level between weapon systems and battlefield MIS metric.
Item Description
Evaluation item Interoperability and standardization(C) >> Interoperability between weapon systems and IS(C.3)
Metric <C-3-1> Average interoperability level between weapon systems and battlefield MIS
Explanation Average LISI (Levels of Information System Interoperability) between weapon systems and battlefield MIS
Measurement method X = Average LISI between weapon systems and battlefield MIS
Y = X / (target value) × 100 (satisfaction ratio of interoperability)
Data gathering method (Data sources) □ System   ■ Data
□ Questionnaires   □ Interview
※ Use the LISI result measured by Defense Interoperability Center
Table A16. <C-3-2> Average data interconnection ratio between weapon systems and battlefield MIS metric.
Item Description
Evaluation item Interoperability and standardization(C) >> Interoperability between weapon systems and IS(C.3)
Metric <C-3-2> Average data interconnection ratio between weapon systems and battlefield MIS
Explanation Average data interconnection ratio of interface between weapon systems and battlefield MIS
Measurement method X = (# of realized interconnections) / (# of required interconnections)
Y = X / (target value) × 100 (satisfaction ratio of data interconnection)
Data gathering method (Data sources) ■ System   ■ Data
□ Questionnaires   □ Interview
※ Measure the number of the realized interconnection from ISs and the number of the required interconnection from ROC / IER / SSS
Table A17. <D-1-1> Intrusion detection ratio in advance metric.
Item Description
Evaluation item Information security(D) >> Detection level (capability) of infringement incident(D.1)
Metric <D-1-1> Intrusion detection ratio in advance
Explanation The ratio of detecting the infringement incident in advance
Measurement method X = (# of detections in advance) / (total # of infringement incidents)
Y = X / (target value) × 100 (satisfaction ratio of detection in advance)
Data gathering method (Data sources) ■ System   ■ Data
□ Questionnaires   □ Interview
※ Measure the total number of infringement incident from target IS and the number of detection in advance from the data of Computer Emergency Response Team (CERT)
Table A18. <D-2-1> Number of intrusion incidents per year metric.
Item Description
Evaluation item Information security(D) >> Response level (capability) of infringement incident(D.2)
Metric <D-2-1> Number of intrusion incidents per year
Explanation The number of the infringement incident per year in organization
Measurement method X = The number of the infringement incident per year in organization
Y = (X - target value) / (target value) × 100 (excess ratio of infringement incident occurrence)
Data gathering method (Data sources) ■ System □ Data
□ Questionnaires □ Interview
※ Measure the total number of infringement incident from target IS
Table A19. <D-2-2> Intrusion incident response time metric.
Item Description
Evaluation item Information security(D) >> Response level (capability) of infringement incident(D.2)
Metric <D-2-2> Intrusion incident response time
Explanation The response completion time following the response procedure after the occurrence of the infringement incident (the situation ending time)
Measurement method X = Average processing time till the situation ending after detecting the infringement incident
Y = (X - target value) / (target value) × 100 (excess ratio of processing time)
Data gathering method (Data sources) ■ System ■ Data
□ Questionnaires □ Interview
※ Measure the occurrence of infringement incident from target IS and the processing time from related reports
Table A20. <D-3-1> Trace ratio of infringement incidents metric.
Item Description
Evaluation item Information security(D) >> Recovery of infringement incident(D.3)
Metric <D-3-1> Trace ratio of infringement incidents
Explanation The success ratio of trace about the origin of the infringement incident
Measurement method X = (# of traced infringement incidents) / (total # of infringement incidents)
Y = X / (target value) × 100
Data gathering method (Data sources) ■ System ■ Data
□ Questionnaires ■ Interview
※ Measure the occurrence of infringement incident from target IS and the trace information from related reports
Table A21. <D-3-2> Recovery time after infringement incident metric.
Item Description
Evaluation item Information security(D) >> Recovery of infringement incident(D.3)
Metric <D-3-2> Recovery time after infringement incident
Explanation Average time to recovery after infringement incident
Measurement method X = Average time to recovery after the infringement incident
Y = (X - target value) / (target value) × 100 (excess ratio of time to recovery)
Data gathering method (Data sources) ■ System ■ Data
□ Questionnaires ■ Interview
※ Measure the occurrence of infringement incident from target IS and the recovery time from related reports
Table A22. <E-1-1> Annual time of informatization education metric.
Item Description
Evaluation item Informatization environment(E) >> Efforts to improve informatization capability(E.1)
Metric <E-1-1> Annual time of informatization education
Explanation Annual time of informatization education
Measurement method X = Annual time of informatization education per person in organization
Y = Min(X, target value) / (target value) × 100 (completion ratio of informatization education)
Data gathering method (Data sources) □ System ■ Data
□ Questionnaires □ Interview
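
The completion ratio in Table A22 caps the score at 100 points once the target education time is reached; a one-line sketch with assumed values:

```python
hours, target_hours = 14.0, 20.0  # hypothetical annual education hours and target
y = min(hours, target_hours) / target_hours * 100  # -> 70.0 (capped at 100.0)
print(f"completion ratio: {y:.1f}")
```
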
Table A23. <E-2-1> Informatization master plan metric.
Item Description
Evaluation item Informatization environment(E) >> Informatization master plan(E.2)
Metric <E-2-1> Informatization master plan
Explanation Enterprise Architecture (EA) maturity level measured by National Information Society Agency (NIA) method and the execution level of basic informatization plan
Measurement method X1 = (EA maturity level) / 5
X2 = (# of executed items in master plan) / (total # of items in master plan)
Y = (w1 × X1 / TV1 + w2 × X2 / TV2) × 100, where w1 + w2 = 1 (TV: target value)
Data gathering method (Data sources) □ System ■ Data
□ Questionnaires □ Interview
Table A24. <E-3-1> Efficiency of budget execution metric.
Item Description
Evaluation item Informatization environment(E) >> Efficient execution of budget(E.3)
Metric <E-3-1> Efficiency of budget execution
Explanation Efficiency level of budget execution: executed budget against planned budget, the ratio of budget for new IT project to informatization budget, and the ratio of informatization budget to total defense budget
Measurement method X1 = (executed budget) / (planned budget)
X2 = (budget for new IT projects) / (informatization budget)
X3 = (informatization budget) / (total defense budget)
Y = (w1 × X1 / TV1 + w2 × X2 / TV2 + w3 × X3 / TV3) × 100, where w1 + w2 + w3 = 1 (TV: target value)
Data gathering method (Data sources) □ System ■ Data
□ Questionnaires □ Interview
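
A small sketch of the weighted composite in Table A24, with hypothetical budget figures (arbitrary currency units), assumed target values TV1 to TV3, and equal weights:

```python
x1 = 95 / 100   # executed budget / planned budget (hypothetical)
x2 = 12 / 80    # budget for new IT projects / informatization budget (hypothetical)
x3 = 1.1 / 43   # informatization budget / total defense budget (hypothetical)

targets = (1.00, 0.20, 0.03)  # assumed target values TV1, TV2, TV3
weights = (1/3, 1/3, 1/3)     # assumed weights, w1 + w2 + w3 = 1

y = sum(w * x / tv for w, x, tv in zip(weights, (x1, x2, x3), targets)) * 100
print(f"Y = {y:.1f}")  # -> Y = 85.1
```
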
Table A25. <F-1-1> Ratio of tasks implemented by IS metric.
Item Description
Evaluation item IS use(F) >> Business informatization(F.1)
Metric <F-1-1> Ratio of tasks implemented by IS
Explanation Ratio of tasks implemented by IS to total defense tasks (Informatization level of defense tasks)
Measurement method X = (# of tasks implemented by IS) / (total # of defense tasks)
Y = X / (target value) × 100
Data gathering method (Data sources) ■ System ■ Data
□ Questionnaires □ Interview
Table A26. <F-2-1> Business use metric.
Item Description
Evaluation item IS use(F) >> Business use(F.2)
Metric <F-2-1> Business use
Explanation The ratio of tasks with IS in the defense tasks
Measurement method X1 = (working hours with IS) / (total working hours)
X2 = (# of tasks with IS) / (total # of tasks)
Y = (w1 × X1 / TV1 + w2 × X2 / TV2) × 100, where w1 + w2 = 1 (TV: target value)
Data gathering method (Data sources) □ System ■ Data
■ Questionnaires ■ Interview