Search results for: quality data assurance
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30492

30402 Enablers of Total Quality Management for Social Enterprises: A Study of UAE Social Organizations

Authors: Farhat Sultana

Abstract:

Originality: TQM principles are regarded as tools to enhance organizational performance in most organizations. The paper contributes to the literature on social enterprise because social organizations still lag far behind private, public, and nonprofit organizations in implementing TQM. Study design: The study is based on data and information provided by two case studies and one focus group of social enterprises. Purpose: The purpose of the study is to gain a deep understanding of TQM implementation and to recognize the enablers of TQM that enhance the organizational performance of social enterprises located in the UAE. Findings: According to the findings, the key enablers of Total Quality Management in the case enterprises are leadership support, a strategic approach to quality, continuous improvement, process improvement, employee empowerment, and customer focus practices, though the study also points out some inhibitors of TQM implementation, such as the managerial structure for quality assurance and the performance appraisal mechanism. Research limitations: The findings are based on only two case studies and one focus group, which is not enough to generalize them to all social organizations. Practical implications: The identified TQM enablers can help management implement TQM successfully in social enterprises. Social implications: The study provides an enabling path for social enterprises to implement TQM and deliver quality output that builds a better society.

Keywords: TQM, social enterprise, enablers of TQM, UAE

Procedia PDF Downloads 78
30401 Quality and Quantity in the Strategic Network of Higher Education Institutions

Authors: Juha Kettunen

Abstract:

This study analyzes the quality and the size of the strategic network of higher education institutions. It examines the concept of fitness for purpose in quality assurance and the transaction costs of networking, which affect the number of members in the network. Empirical evidence is presented from the Consortium on Applied Research and Professional Education, a European strategic network of six higher education institutions. The results support the argument that the number of members in a strategic network should be relatively small to produce high-quality results. The practical importance is that networking has been able to promote international research and development projects. The results of this study are important for those who want to design and improve international networks in higher education.

Keywords: balanced scorecard, higher education, social networking, strategic planning

Procedia PDF Downloads 320
30400 Study and Improvement of the Quality of a Production Line

Authors: S. Bouchami, M.N. Lakhoua

Abstract:

The automotive market is a dynamic market that continues to grow, which is why several companies in this sector adopt a quality improvement approach. Wanting to be competitive and successful in the environment in which they operate, these companies are dedicated to establishing a quality management system to ensure the achievement of quality objectives, the improvement of products and processes, and the satisfaction of customers. In this paper, the management of quality and the improvement of a production line in an industrial company are presented. The project is divided into two essential parts: the creation of the technical line documentation and the quality assurance documentation, and the resolution of defects on the line as well as those claimed by the customer. The creation of the documents required a deep understanding of the manufacturing process. The analysis and problem solving were done through the implementation of PDCA (Plan Do Check Act) and FTA (Fault Tree Analysis). As a perspective, in order to better optimize production and improve the efficiency of the production line, a study of the problems associated with the supply of raw materials should be made to solve the stock-outs that cause delays penalizing the industrial company.

Keywords: quality management, documentary system, Plan Do Check Act (PDCA), fault tree analysis (FTA) method

Procedia PDF Downloads 119
30399 Foreign Experience of Higher Education Accreditation for Ukraine

Authors: Dmytro Symak

Abstract:

Experience in other countries shows that the role of accreditation of higher education, as one form of quality assurance for educational services, is increasing. This has been the experience of highly developed countries such as the USA, Canada, France, and Germany, because without a proper quality assurance process it is impossible to secure a successful future for the nation and the state. In most countries, the function of higher education accreditation is performed by public authorities, in particular the Ministry of Education. In the US, however, the quality assurance process is independent of the government and is implemented by a private non-governmental organization, the Council for Higher Education Accreditation. In France, the main body that carries out accreditation of higher education is the Ministry of National Education. Mutual recognition and accreditation of degrees is part of the Bologna process. While higher education institutions issue diplomas, the ministry awards the title; this is the main level of accreditation, awarded automatically to state universities. In total, France has the following further major levels of higher education accreditation: accreditation for a visa (second-level accreditation) and recognition of accreditation (third-level accreditation). In some areas of education, the ministry must adopt formal recommendations from specific bodies before accreditation. There are also some exceptions: French educational institutions, mainly large business schools, seek non-French accreditation. These include, for example, the Association to Advance Collegiate Schools of Business, the Association of MBAs, the European Foundation for Management Development, and the European Quality Improvement System, a prestigious EFMD programme accreditation system. The German accreditation system is also noteworthy. The primary body here is the Conference of Ministers of Education and Cultural Affairs of the Länder in the Federal Republic of Germany (Kultusministerkonferenz), established in 1948 by agreement between the states of the Federal Republic of Germany. Among its main responsibilities is ensuring quality and continuity of development in higher education. In Germany, bachelor's and master's programmes must be accredited in accordance with Kultusministerkonferenz resolutions. In Ukraine, higher education accreditation is carried out by the Ministry of Education, Youth and Sports of Ukraine at four main levels. Ukraine's legislation on higher education is based on the Constitution of Ukraine and consists of the laws of Ukraine 'On Education', 'On Scientific and Technical Activity', 'On Higher Education', and other legal acts, and lies entirely within the competence of the state. This leads to considerable centralization and bureaucratization of the process. Thus, the analysis of foreign experience leads to the conclusion that reforming the system of accreditation and quality assurance of higher education in Ukraine for its integration into the global education space requires solving a number of problems in the following areas: improving the system of state certification and licensing; optimizing the network of higher education institutions; creating both governmental and non-governmental organizations to monitor the process of higher education in Ukraine, and so on.

Keywords: higher education, accreditation, decentralization, education institutions

Procedia PDF Downloads 313
30398 The Validation of RadCalc for Clinical Use: An Independent Monitor Unit Verification Software

Authors: Junior Akunzi

Abstract:

In patient treatment planning quality assurance for 3D conformal radiotherapy (3D-CRT) and volumetric modulated arc therapy (VMAT or RapidArc), the independent monitor unit verification calculation (MUVC) is an indispensable part of the process. For 3D-CRT treatment planning, the MUVC can be performed manually applying the standard ESTRO formalism. However, due to the complex shape and the number of beams in advanced treatment planning techniques such as RapidArc, manual independent MUVC is inadequate. Therefore, commercially available software such as RadCalc can be used to perform the MUVC for complex treatment plans. Indeed, RadCalc (version 6.3, LifeLine Inc.) uses a simplified Clarkson algorithm to compute the dose contribution of individual RapidArc fields to the isocenter. The purpose of this project is the validation of RadCalc for 3D-CRT and RapidArc treatment planning dosimetry quality assurance at the Centre Antoine Lacassagne (Nice, France). Firstly, the interfaces between RadCalc and our treatment planning systems (TPS), Isogray (version 4.2) and Eclipse (version 13.6), were checked for data transfer accuracy. Secondly, we created test plans in both Isogray and Eclipse featuring open fields, wedged fields, and irregular MLC fields. These test plans were transferred from the TPSs to RadCalc and to the linac via Mosaiq (version 2.5), following the DICOM RT protocol. Measurements were performed in a water phantom using a PTW cylindrical Semiflex ionisation chamber (0.3 cm³, type 31010) and compared with the TPS and RadCalc calculations. Finally, 30 3D-CRT plans and 40 RapidArc plans created on patient CT scans were recalculated using the CT scan of a solid PMMA water-equivalent phantom for 3D-CRT and the Octavius II phantom (PTW) CT scan for RapidArc. We then measured the doses delivered to these phantoms for each plan with a 0.3 cm³ PTW 31010 cylindrical Semiflex ionisation chamber (3D-CRT) and a 0.015 cm³ PTW PinPoint ionisation chamber (RapidArc). For our test plans, good agreement was found between calculation (RadCalc and TPSs) and measurement (mean: 1.3%; standard deviation: ± 0.8%). Regarding the patient plans, the measured doses were compared to the calculations in RadCalc and in our TPSs; moreover, RadCalc calculations were compared to the Isogray and Eclipse ones. Agreement better than (2.8%; ± 1.2%) was found between RadCalc and the TPSs. As for the comparison between calculation and measurement, the agreement for all of our plans was better than (2.3%; ± 1.1%). The independent MU verification calculation software RadCalc has thus been validated for clinical use for both 3D-CRT and RapidArc techniques. The perspective of this project includes the validation of RadCalc for the Tomotherapy machine installed at the Centre Antoine Lacassagne.
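
The agreement figures quoted above are percent differences between an independent calculation and a chamber measurement, summarised as a mean and standard deviation. The short sketch below illustrates that bookkeeping; the plan names and dose values are hypothetical placeholders, not data from this study.

```python
import statistics

# Hypothetical paired point doses (Gy): (independent calculation, chamber measurement).
# Plan labels and values are placeholders, not data from the study.
paired_doses = {
    "3DCRT_plan_01": (2.004, 1.981),
    "3DCRT_plan_02": (1.752, 1.740),
    "RapidArc_plan_01": (2.110, 2.065),
    "RapidArc_plan_02": (1.898, 1.923),
}

# Percent difference of the calculation relative to the measurement, per plan
diffs = {plan: 100.0 * (calc - meas) / meas for plan, (calc, meas) in paired_doses.items()}

for plan, d in diffs.items():
    print(f"{plan}: {d:+.2f}%")

# Summarise agreement as mean +/- standard deviation, as quoted in the abstract
print(f"agreement: mean {statistics.mean(diffs.values()):+.2f}%, "
      f"SD {statistics.stdev(diffs.values()):.2f}%")
```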

Keywords: 3D conformal radiotherapy, intensity modulated radiotherapy, monitor unit calculation, dosimetry quality assurance

Procedia PDF Downloads 190
30397 Introduction of Electronic Health Records to Improve Data Quality in Emergency Department Operations

Authors: Anuruddha Jagoda, Samiddhi Samarakoon, Anil Jasinghe

Abstract:

In its simplest form, data quality can be defined as 'fitness for use', and it is a concept with multiple dimensions. Emergency Departments (ED) require information to treat patients and, on the other hand, are the primary source of information regarding accidents, injuries, emergencies, etc. The ED is also the starting point of various patient registries, databases, and surveillance systems. This interventional study was carried out to improve data quality at the ED of the National Hospital of Sri Lanka (NHSL) by introducing an eHealth solution. The NHSL is the premier trauma care centre in Sri Lanka. The study consisted of three components. A research study was conducted to assess the quality of data in relation to five selected dimensions of data quality, namely accuracy, completeness, timeliness, legibility, and reliability. The intervention was to develop and deploy an electronic emergency department information system (eEDIS). Post-intervention assessment confirmed that all five dimensions of data quality had improved. The most significant improvements were observed in the accuracy and timeliness dimensions.

Keywords: electronic health records, electronic emergency department information system, emergency department, data quality

Procedia PDF Downloads 250
30396 Program Level Learning Outcomes in Music and Technology: Toward Improved Assessment and Better Communication

Authors: Susan Lewis

Abstract:

The assessment of learning outcomes at the program level has attracted much international interest from the perspectives of quality assurance and ongoing curricular redesign and renewal. This paper examines program-level learning outcomes in the field of music and technology, an area of study that has seen an explosion in program development over the past fifteen years. The Audio Engineering Society (AES) maintains an online directory of educational institutions worldwide, yielding the most comprehensive inventory of programs and courses in music and technology. The inventory includes courses, programs, and degrees in music and technology, music and computer science, music production, and the music industry. This paper focuses on published student learning outcomes for undergraduate degrees in music and technology and analyses commonalities at institutions in North America, the United Kingdom, and Europe. The results of a survey of student learning outcomes at twenty institutions indicate a focus on three distinct student learning outcomes: (1) cross-disciplinary knowledge in the fields of music and technology; (2) the practical application of training through the professional industry; and (3) the acquisition of skills in communication and collaboration. The paper then analyses assessment mechanisms for tracking student learning and achievement of learning outcomes at these institutions. The results indicate highly variable assessment practices. Conclusions offer recommendations for enhancing assessment techniques and better communicating learning outcomes to students.

Keywords: quality assurance, student learning, learning outcomes, music and technology

Procedia PDF Downloads 150
30395 Healthcare Service Quality in Indian Context

Authors: Ganesh Nivrutti Akhade

Abstract:

This paper attempts to develop a reliable and valid instrument for measuring healthcare service quality in India and analyses the impact of respondents' demographic factors on healthcare service quality. In this research, an extant literature survey, discussions with healthcare system stakeholders such as patients, patients' relatives, administrators of hospitals and clinics, and professionals, together with expert interviews, were used to develop the attributes of the healthcare service quality dimensions. A pilot study was conducted with a sample of 31 healthcare patients of private-sector, public-sector, and trust hospitals, primary health care centres, and clinics surveyed in the Nagpur Metropolitan Area. In the end, fifteen dimensions emerged: reliability, assurance, responsiveness, tangibility, empathy, affordability, respect and caring, attitude of staff, technical competence, appropriateness, safety, continuity, effectiveness, availability, and financial support. This fifteen-dimensional model was validated through content validity and construct validity. The proposed research model shows acceptable fit indices. The impact of these dimensions on overall healthcare service quality and customer satisfaction is analysed using the multiple regression technique. Findings indicate that all dimensions have a significant impact on overall healthcare service quality perceptions and customer satisfaction. However, the availability and effectiveness dimensions carry the maximum impact on overall healthcare service quality.
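
The final analysis step described, regressing overall perceived quality on the dimension scores, can be sketched as follows. The synthetic ratings, the subset of four dimensions, and the weights used to generate them are illustrative assumptions, not the study's data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200  # hypothetical respondents (the pilot described above used 31)

# Synthetic 1-5 ratings for four of the fifteen dimensions (illustration only)
dims = ["availability", "effectiveness", "reliability", "empathy"]
X = rng.integers(1, 6, size=(n, len(dims))).astype(float)

# Overall quality generated with assumed weights plus noise, so that availability
# and effectiveness dominate, mirroring the reported finding
overall = 0.5 * X[:, 0] + 0.4 * X[:, 1] + 0.2 * X[:, 2] + 0.1 * X[:, 3] + rng.normal(0, 0.5, n)

# Multiple regression of overall service quality on the dimension scores
model = sm.OLS(overall, sm.add_constant(X)).fit()
print(model.summary(xname=["const"] + dims))
```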

Keywords: healthcare, service quality, factor analysis (CFA), India, service quality dimensions

Procedia PDF Downloads 252
30394 Investigating the Relationship of Moral Hazard and Corporate Governance with Earnings Forecast Quality in the Tehran Stock Exchange

Authors: Fatemeh Rouhi, Hadi Nassiri

Abstract:

Earnings forecasts are a key element in economic decisions, but conflicts of interest in financial reporting, complexity, and the lack of direct access to information lead to information asymmetry between individuals within the organization and external investors and creditors. This asymmetry creates adverse selection and moral hazard in investors' decisions and makes it difficult for users to assess the data directly. In this regard, the role of corporate governance and its disclosure becomes clear: it includes controls and procedures to ensure that management does not act in its own interests but moves in the direction of maximizing shareholder and company value. Given the importance of earnings forecasts in the capital market and the need to identify the factors influencing them, this study attempts to establish the relationship between moral hazard and corporate governance and the earnings forecast quality of companies operating in the capital market. Drawing on the theoretical basis of the research, two main hypotheses and several sub-hypotheses are presented and examined on the basis of the available models using the panel data method; conclusions are drawn at the 95% confidence level according to the significance of the model and of each independent variable. In examining the models, the Chow test was first used to specify whether the panel data method or the pooled method should be used; following that, the Hausman test was applied to choose between random effects and fixed effects. The findings show that, because most of the moral hazard variables are positively associated with earnings forecast quality, the earnings forecast quality of companies listed on the Tehran Stock Exchange increases as moral hazard increases. Among the corporate governance variables, board independence has a significant relationship with earnings forecast accuracy and earnings forecast bias, but the relationship between board size and earnings forecast quality is not statistically significant.
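
The Chow-then-Hausman sequence described above is standard panel data practice. Below is a minimal numpy sketch of the Hausman statistic, assuming fixed-effects and random-effects coefficient vectors and covariance matrices have already been estimated; the numbers are placeholders, not results from this study.

```python
import numpy as np
from scipy import stats

# Hypothetical estimates for two regressors (placeholders, not study results)
b_fe = np.array([0.42, -0.15])            # fixed-effects coefficients
b_re = np.array([0.38, -0.11])            # random-effects coefficients
V_fe = np.array([[0.0025, 0.0003],        # covariance of the FE estimates
                 [0.0003, 0.0016]])
V_re = np.array([[0.0018, 0.0002],        # covariance of the RE estimates
                 [0.0002, 0.0011]])

diff = b_fe - b_re
# Hausman statistic: diff' [V_fe - V_re]^{-1} diff ~ chi2(k) under H0 (RE is consistent)
H = diff @ np.linalg.inv(V_fe - V_re) @ diff
p_value = stats.chi2.sf(H, df=len(diff))
print(f"Hausman H = {H:.3f}, p = {p_value:.3f}")
# p < 0.05 -> reject random effects in favour of fixed effects at the study's 95% level
```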

Keywords: corporate governance, earnings forecast quality, moral hazard, financial sciences

Procedia PDF Downloads 293
30393 Multi-Label Approach to Facilitate Test Automation Based on Historical Data

Authors: Warda Khan, Remo Lachmann, Adarsh S. Garakahally

Abstract:

The increasing complexity of software and its applicability in a wide range of industries, e.g., automotive, call for enhanced quality assurance techniques. Test automation is one option to tackle the prevailing challenges by supporting test engineers with fast, parallel, and repetitive test executions. A high degree of test automation allows for a shift from mundane (manual) testing tasks to a more analytical assessment of the software under test. However, a high initial investment of test resources is required to establish test automation, which, in most cases, conflicts with the time constraints provided for quality assurance of complex software systems. Hence, computer-aided creation of automated test cases is crucial to increase the benefit of test automation. This paper proposes the application of machine learning for the generation of automated test cases. It is based on supervised learning to analyze test specifications and existing test implementations. The analysis facilitates the identification of patterns between test steps and their implementation with test automation components. For the test case generation, this approach exploits historical data of test automation projects. The identified patterns are the foundation to predict the implementation of unknown test case specifications. Based on this support, a test engineer only has to review and parameterize the test automation components instead of writing them manually, resulting in a significant time reduction for establishing test automation. Compared to other generation approaches, this ML-based solution can handle different writing styles, authors, application domains, and even languages. Furthermore, test automation tools require expert knowledge in the form of programming skills, whereas this approach only requires historical data to generate test cases. The proposed solution is evaluated using various multi-label evaluation criteria (EC) and two small-sized real-world systems. The most prominent EC is 'Subset Accuracy'. The promising results show an accuracy of at least 86% for test cases where a 1:1 relationship (multi-class) between test step specification and test automation component exists. For complex multi-label problems, i.e., where one test step can be implemented by several components, the prediction accuracy is still 60%, which is better than current state-of-the-art results. The prediction quality is expected to increase for larger systems with corresponding historical data. Consequently, this technique facilitates the time reduction for establishing test automation and is thereby independent of the application domain and project. As work in progress, the next steps are to investigate incremental and active learning as additions to increase the usability of this approach, e.g., in case labelled historical data is scarce.
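
The abstract does not name a specific classifier, so the following is only one plausible realisation: a multi-label text-classification sketch in which test-step specifications are mapped to automation components and evaluated with subset accuracy (exact match). All step texts and component names are invented for illustration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MultiLabelBinarizer

# Hypothetical historical data: test step specifications and the automation
# components that implemented them (labels invented for illustration).
steps = [
    "Switch ignition on and check instrument cluster",
    "Send CAN message and verify response timeout",
    "Switch ignition off and verify cluster shutdown",
    "Send diagnostic request and log the response",
]
components = [
    ["IgnitionControl", "ClusterCheck"],
    ["CanBus", "TimeoutCheck"],
    ["IgnitionControl", "ClusterCheck"],
    ["CanBus", "Logger"],
]

mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(components)  # multi-label indicator matrix

# Text features plus one binary classifier per automation component
clf = make_pipeline(TfidfVectorizer(),
                    OneVsRestClassifier(LogisticRegression(max_iter=1000)))
clf.fit(steps, Y)

# Predict components for unseen specifications and evaluate subset accuracy
new_steps = ["Switch ignition on and verify cluster",
             "Send CAN message and check timeout"]
true = mlb.transform([["IgnitionControl", "ClusterCheck"], ["CanBus", "TimeoutCheck"]])
pred = clf.predict(new_steps)
print("predicted components:", mlb.inverse_transform(pred))
print("subset accuracy:", accuracy_score(true, pred))  # exact-match ratio
```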

Keywords: machine learning, multi-class, multi-label, supervised learning, test automation

Procedia PDF Downloads 97
30392 Addressing Supply Chain Data Risk with Data Security Assurance

Authors: Anna Fowler

Abstract:

When considering assets that may need protection, the mind begins to contemplate homes, cars, and investment funds. In most cases, the protection of those assets can be covered through security systems and insurance. Data is not the first thought that comes to mind as something needing protection, even though data is at the core of most supply chain operations. It includes trade secrets, management of personally identifiable information (PII), and consumer data that can be used to enhance the overall experience. Data is considered a critical element of success for supply chains and should be one of the most critical areas to protect. In the supply chain industry, there are two major misconceptions about protecting data: (i) we do not manage or store confidential/personally identifiable information (PII); (ii) reliance on third-party vendor security. These misconceptions can significantly derail organizational efforts to adequately protect data across environments. These statistics can be exciting yet overwhelming at the same time. The first misconception, "we do not manage or store confidential/personally identifiable information (PII)", is dangerous as it implies the organization does not have proper data literacy. Enterprise employees will zero in on the aspect of PII while neglecting trade secret theft and the complete breakdown of information sharing. To circumvent the first misconception, the second forges an ideology that reliance on third-party vendor security will absolve the company from security risk. Instead, third-party risk has grown over the last two years and is one of the major causes of data security breaches. It is important to understand that a holistic approach should be considered when protecting data, which should not simply mean purchasing a Data Loss Prevention (DLP) tool; a tool is not a solution. To protect supply chain data, start by providing data literacy training to all employees and negotiating the security component of contracts with vendors to highlight data literacy training for individuals/teams that may access company data. It is also important to understand the origin of the data and its movement, including risk identification. Ensure processes effectively incorporate data security principles. Evaluate and select DLP solutions to address specific concerns/use cases in conjunction with data visibility. These approaches are part of a broader solutions framework called Data Security Assurance (DSA). The DSA framework looks at all of the processes across the supply chain, including their corresponding architecture and workflows, employee data literacy, governance and controls, integration between third- and fourth-party vendors, DLP as a solution concept, and policies related to data residency. Within cloud environments, this framework is crucial for the supply chain industry to avoid regulatory implications and third/fourth-party risk.

Keywords: security by design, data security architecture, cybersecurity framework, data security assurance

Procedia PDF Downloads 63
30391 History of Film in (West/South) Africa: The Emergence of the Film Production Economy

Authors: Sibusiso Mnyanda

Abstract:

Storytelling through motion pictures is a valuable economy. South Africa was one of the first countries in the world to see and hear sound motion pictures, with Lingards Waxworks in Durban first showing them in August 1895. This article celebrates and takes a microscopic look at the development of this industry and its economy, highlighting these fundamentals: the skill levels and talent sets displayed in this emergence, the quality of the products produced by filmmakers and actors, the level of administration and quality assurance of production houses, and the general infrastructure and resources available to the industry at the time.

Keywords: film, Africa, production economy, history

Procedia PDF Downloads 36
30390 Data Management System for Environmental Remediation

Authors: Elizaveta Petelina, Anton Sizo

Abstract:

Environmental remediation projects deal with a wide spectrum of data, including data collected during site assessment, execution of remediation activities, and environmental monitoring. Therefore, appropriate data management is required as a key factor for well-grounded decision making. The Environmental Data Management System (EDMS) was developed to address all necessary data management aspects, including efficient data handling and data interoperability, access to historical and current data, spatial and temporal analysis, 2D and 3D data visualization, mapping, and data sharing. The system focuses on supporting well-grounded decision making in relation to required mitigation measures and assessment of remediation success. The EDMS is a combination of enterprise- and desktop-level data management and Geographic Information System (GIS) tools assembled to assist environmental remediation, project planning and evaluation, and environmental monitoring of mine sites. The EDMS consists of seven main components: a Geodatabase containing a spatial database to store and query spatially distributed data; a GIS and Web GIS component that combines desktop and server-based GIS solutions; a Field Data Collection component that contains tools for field work; a Quality Assurance (QA)/Quality Control (QC) component that combines operational procedures for QA and measures for QC; a Data Import and Export component that includes tools and templates to support project data flow; a Lab Data component that provides a connection between the EDMS and laboratory information management systems; and a Reporting component that includes server-based services for real-time report generation. The EDMS has been successfully implemented for Project CLEANS (Clean-up of Abandoned Northern Mines). Project CLEANS is a multi-year, multimillion-dollar project aimed at assessing and reclaiming 37 uranium mine sites in northern Saskatchewan, Canada. The EDMS has effectively facilitated integrated decision-making for CLEANS project managers and transparency amongst stakeholders.

Keywords: data management, environmental remediation, geographic information system, GIS, decision making

Procedia PDF Downloads 131
30389 Gap Analysis of Service Quality: The Veterinary Teaching Hospital, University of Peradeniya, Sri Lanka

Authors: Preethi Sudarshanie Dassanayake, R. A. Sudath Weerasiri

Abstract:

Objective: The objectives of this study were to find out the highest expectation and perception scores, the highest gaps between perception and expectation of service quality, and whether the gaps between perception and expectation across the service quality dimensions were statistically significant. Methodology: This study was carried out at the Out Patient Department (OPD) of the Veterinary Teaching Hospital (VTH), University of Peradeniya. A modified version of SERVQUAL with 22 pairs of items regarding expectation and perception of service quality in the dimensions of tangibles, reliability, responsiveness, assurance, and empathy was included in Part 1, and Part 2 of the questionnaire consisted of questions regarding socio-demographic factors. The sample size was 200, and the sampling procedure was systematic random sampling. Customers above 18 years of age who were able to read, write, and understand Sinhala or English, had visited more than twice in the last six months, and were willing to respond were selected. Findings: The analysis revealed that customers' expectations of service were higher than their perceptions for all 22 items of the SERVQUAL. This high expectation suggests that there is sufficient room for further improvement of service quality in all five dimensions. Originality/Value of the Paper: This study gives new insight into the poorly researched area of veterinary health service quality in the Sri Lankan context. It helps hospital administrators and policy makers develop strategies for further improvement of service quality according to customers' views.
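
The gap analysis described above reduces to computing, for each dimension, the paired difference between perception and expectation scores and testing whether it differs from zero. A minimal sketch is shown below; the Likert scores are synthetic, not the survey's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
dimensions = ["tangibles", "reliability", "responsiveness", "assurance", "empathy"]
n = 200  # sample size reported in the study

# Synthetic 1-7 Likert scores; real data would come from the 22 paired questionnaire items
expectation = {d: rng.integers(5, 8, n).astype(float) for d in dimensions}
perception = {d: rng.integers(3, 7, n).astype(float) for d in dimensions}

for d in dimensions:
    gap = perception[d] - expectation[d]                    # SERVQUAL gap: P - E (negative = shortfall)
    t, p = stats.ttest_rel(perception[d], expectation[d])   # paired t-test on the gap
    print(f"{d:15s} mean gap {gap.mean():+.2f}  t = {t:.2f}  p = {p:.4g}")
```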

Keywords: expectation, perception, service quality, SERVQUAL, veterinary health care

Procedia PDF Downloads 447
30388 Inclusion and Changes of a Research Criterion in the Institute for Quality and Accreditation of Computing, Engineering and Technology Accreditation Model

Authors: J. Daniel Sanchez Ruiz

Abstract:

The paper explains why and how a research criterion was included within an accreditation system for undergraduate engineering programs, in spite of this not being a common practice of accreditation agencies at a global level. The paper is divided into three parts. The first presents the context and the motivations that led the Institute for Quality and Accreditation of Computing, Engineering and Technology Programs (ICACIT) to add a research criterion. The second describes the criterion adopted and the feedback received during the 2017 accreditation cycle. In the third, the author proposes changes to the accreditation criteria that respond in a pertinent way to the results-based accreditation model and the national context. The author seeks to reconcile an outcome-based accreditation model, aligned with that established by the International Engineering Alliance, with the particular context of higher education in Peru.

Keywords: accreditation, engineering education, quality assurance, research

Procedia PDF Downloads 260
30387 Biosensors as Analytical Tools in Legume Processing

Authors: S. V. Ncube, A. I. O. Jideani, E. T. Gwata

Abstract:

The plight of food insecurity in developing countries has led to renewed interest in underutilized legumes. Their nutritional versatility, desirable functionality, pharmaceutical value, and inherent bioactive compounds have drawn the attention of researchers. This has provoked the development of value-added products with the aim of commercially exploiting their full potential. However, processing of these legumes leads to changes in nutritional composition as affected by processing variables like pH, temperature, and pressure. There is therefore a need for process control and quality assurance during production of the value-added products. However, conventional methods for microbiological and biochemical identification are labour-intensive and time-consuming. Biosensors offer rapid and affordable methods to assure the quality of the products. They may be used to quantify nutrients and anti-nutrients in the products while manipulating and monitoring variables such as pH, temperature, pressure, and oxygen that affect the quality of the final product. This review gives an overview of the types of biosensors used in the food industry, their advantages and disadvantages, and their possible application in the processing of legumes.

Keywords: legume processing, biosensors, quality control, nutritional versatility

Procedia PDF Downloads 466
30386 Data Quality as a Pillar of Data-Driven Organizations: Exploring the Benefits of Data Mesh

Authors: Marc Bachelet, Abhijit Kumar Chatterjee, José Manuel Avila

Abstract:

Data quality is a key component of any data-driven organization. Without data quality, organizations cannot effectively make data-driven decisions, which often leads to poor business performance. Therefore, it is important for an organization to ensure that the data they use is of high quality. This is where the concept of data mesh comes in. Data mesh is an organizational and architectural decentralized approach to data management that can help organizations improve the quality of data. The concept of data mesh was first introduced in 2020. Its purpose is to decentralize data ownership, making it easier for domain experts to manage the data. This can help organizations improve data quality by reducing the reliance on centralized data teams and allowing domain experts to take charge of their data. This paper intends to discuss how a set of elements, including data mesh, are tools capable of increasing data quality. One of the key benefits of data mesh is improved metadata management. In a traditional data architecture, metadata management is typically centralized, which can lead to data silos and poor data quality. With data mesh, metadata is managed in a decentralized manner, ensuring accurate and up-to-date metadata, thereby improving data quality. Another benefit of data mesh is the clarification of roles and responsibilities. In a traditional data architecture, data teams are responsible for managing all aspects of data, which can lead to confusion and ambiguity in responsibilities. With data mesh, domain experts are responsible for managing their own data, which can help provide clarity in roles and responsibilities and improve data quality. Additionally, data mesh can also contribute to a new form of organization that is more agile and adaptable. By decentralizing data ownership, organizations can respond more quickly to changes in their business environment, which in turn can help improve overall performance by providing better insights into the business through better reports and visualization tools. Monitoring and analytics are also important aspects of data quality. With data mesh, monitoring and analytics are decentralized, allowing domain experts to monitor and analyze their own data. This helps identify and address data quality problems quickly, leading to improved data quality. Data culture is another major aspect of data quality. With data mesh, domain experts are encouraged to take ownership of their data, which can help create a data-driven culture within the organization. This can lead to improved data quality and better business outcomes. Finally, the paper explores the contribution of AI in the coming years. AI can help enhance data quality by automating many data-related tasks, such as data cleaning and data validation. By integrating AI into data mesh, organizations can further enhance the quality of their data. The concepts mentioned above are illustrated by feedback from AEKIDEN's experience. AEKIDEN is an international data-driven consultancy that has successfully implemented a data mesh approach. By sharing their experience, AEKIDEN can help other organizations understand the benefits and challenges of implementing data mesh and improving data quality.

Keywords: data culture, data-driven organization, data mesh, data quality for business success

Procedia PDF Downloads 103
30385 Mastering Test Automation: Bridging Gaps for Seamless QA

Authors: Rohit Khankhoje

Abstract:

The rapid evolution of software development practices has given rise to an increasing demand for efficient and effective test automation. The paper titled "Mastering Test Automation: Bridging Gaps for Seamless QA" delves into the crucial aspects of test automation, addressing the obstacles faced by organizations in achieving flawless quality assurance. The paper highlights the importance of bridging knowledge gaps within organizations, emphasizing the necessity for management to acquire a deeper comprehension of test automation scenarios, coverage, report trends, and the importance of communication. To tackle these challenges, this paper introduces innovative solutions, including the development of an automation framework that seamlessly integrates with test cases and reporting tools like TestRail and Jira. This integration facilitates the automatic recording of bugs in Jira, enhancing bug reporting and communication between manual QA and automation teams, while TestRail receives all newly added automated test cases as soon as they become part of the automation suite. The paper demonstrates how this framework empowers management by providing clear insights into ongoing automation activities, bug origins, trend analysis, and test case specifics. "Mastering Test Automation" serves as a comprehensive guide for organizations aiming to enhance their quality assurance processes through effective test automation. It not only identifies the common pitfalls and challenges but also offers practical solutions to bridge the gaps, resulting in a more streamlined and efficient QA process.

Keywords: automation framework, API integration, test automation, test management tools

Procedia PDF Downloads 47
30384 Quality Assurance in Cardiac Disorder Detection Images

Authors: Anam Naveed, Asma Andleeb, Mehreen Sirshar

Abstract:

In this article, image processing techniques have been applied to cardiac images to enhance image quality. Two types of methodologies are considered in the survey: invasive techniques and non-invasive techniques. Different image processing methods for improving cardiac image quality and reducing radiation exposure in invasive techniques are explored, and different image processing algorithms for enhancing non-invasive cardiac image quality are described. Besides these two methodologies, a third methodology is applied to live streaming of the heart rate in an ECG window for extracting necessary information, removing noise, and enhancing quality. Sensitivity analyses have been carried out to investigate the impact of cardiac images on the diagnosis of cardiac artery disease and how image enhancement will help the cardiologist diagnose the disease. The paper evaluates the strengths and weaknesses of the different techniques applied to improve image quality and draws a conclusion. Some specific limitations must be considered for the whole survey; for example, the patient's heart rate must be 70-75 beats/minute during angiography, and patient weight and the amount of radiation exposure are similarly limited.

Keywords: cardiac images, CT angiography, critical analysis, exposure radiation, invasive techniques, non-invasive techniques

Procedia PDF Downloads 319
30383 Immunization Data Quality in Public Health Facilities in Pastoralist Communities: A Comparative Study with Evidence from Afar and Somali Regional States, Ethiopia

Authors: Melaku Tsehay

Abstract:

The Consortium of Christian Relief and Development Associations (CCRDA) and the CORE Group Polio Partners (CGPP) Secretariat have been working with the Global Alliance for Vaccines and Immunization (GAVI) to improve immunization data quality in the Afar and Somali Regional States. The main aim of this study was to compare the quality of immunization data before and after the above interventions in health facilities in pastoralist communities in Ethiopia. To this end, a comparative cross-sectional study was conducted on 51 health facilities. The baseline data were collected in May 2019, and the endline data in August 2021. The WHO data quality self-assessment (DQS) tool was used to collect data. A significant improvement was seen in the accuracy of the pentavalent vaccine (PT)1 data (p = 0.012) at the health posts (HP), and in PT3 (p = 0.010) and measles (p = 0.020) data at the health centers (HC). Besides, a highly significant improvement was observed in the accuracy of tetanus toxoid (TT)2 data at the HPs (p < 0.001). The level of over- or under-reporting was found to be < 8% at the HPs and < 10% at the HCs for PT3. Data completeness also increased from 72.09% to 88.89% at the HCs. Nearly 74% of the health facilities reported their respective immunization data on time, which is much better than the baseline (7.1%) (p < 0.001). These findings may provide some hints for policies and programs targeting the improvement of immunization data quality in pastoralist communities.
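
The DQS accuracy ratio referred to above is commonly expressed as a verification factor: doses recounted from facility registers divided by the doses reported upward. A minimal sketch of that calculation is shown below; the facility names and tallies are hypothetical, not figures from this study.

```python
# Hypothetical recount vs. report figures for one antigen (not the study's data)
facilities = {
    "Health centre A": {"recounted": 112, "reported": 118},
    "Health centre B": {"recounted": 96, "reported": 95},
    "Health post 1": {"recounted": 40, "reported": 55},
}

for name, tally in facilities.items():
    # DQS accuracy ratio (verification factor): recounted doses / reported doses
    vf = 100.0 * tally["recounted"] / tally["reported"]
    status = "over-reporting" if vf < 100 else "under-reporting" if vf > 100 else "exact"
    print(f"{name}: verification factor {vf:.1f}% ({status})")
```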

Keywords: data quality, immunization, verification factor, pastoralist region

Procedia PDF Downloads 70
30382 A Study of Quality Assurance and Unit Verification Methods in Safety Critical Environment

Authors: Miklos Taliga

Abstract:

In the present case study, we examined the development and testing methods of systems that contain safety-critical elements in different industrial fields. Consequently, we observed the classical object-oriented development and testing environment, as both the medical technology and automobile industries approach the development of safety-critical elements in that way. Subsequently, we examined model-based development. We introduce the quality parameters that define development and testing. While taking modern agile methodology (Scrum) into consideration, we examined whether and to what extent the methodologies we found fit into this environment.

Keywords: safety-critical elements, quality management, unit verification, model-based testing, agile methods, scrum, metamodel, object-oriented programming, field specific modelling, sprint, user story, UML standard

Procedia PDF Downloads 561
30381 Analyzing the Impact of Code Commenting on Software Quality

Authors: Thulya Premathilake, Tharushi Perera, Hansi Thathsarani, Tharushi Nethmini, Dilshan De Silva, Piyumika Samarasekara

Abstract:

One of the most efficient ways to assist developers in grasping source code is to make use of comments, which can be found throughout the code. When working in fields such as software development, having good-quality comments in your code is a fundamental requirement, especially when tackling software problems in programs that have already been built. It is essential for the intention of the source code to be made crystal clear in the comments added to the code. This assists programmers in better comprehending the programs they are working on and enables them to complete software maintenance jobs in a more timely manner. In spite of the fact that comments and documentation are meant to improve readability and maintainability, the vast majority of programmers place most of their focus on the actual code being written. This study provides a complete and comprehensive overview of the previous research that has been conducted on the topic of code comments. The study focuses on four main topics: automated comment production, comment consistency, comment classification, and comment quality rating. Analysing the approaches used in prior work on this issue provides more complete knowledge for use in subsequent inquiries.

Keywords: code commenting, source code, software quality, quality assurance

Procedia PDF Downloads 62
30380 A Techno-Economic Simulation Model to Reveal the Relevance of Construction Process Impact Factors for External Thermal Insulation Composite System (ETICS)

Authors: Virgo Sulakatko

Abstract:

The reduction of energy consumption in the built environment has been one of the topics tackled by the European Commission during the last decade. Increased energy efficiency requirements have increased the renovation rate of apartment buildings covered with an External Thermal Insulation Composite System (ETICS). Due to the fast and optimized application process, quality assurance depends to a large extent on the specific activities of artisans and is often not controlled. The on-site degradation factors (DF) have a technical influence on the façade and cause future costs to the owner. Besides thermal conductivity, the building envelope needs to ensure mechanical resistance and stability, fire, noise, corrosion, and weather protection, and long-term durability. As the shortcomings of the construction phase become problematic after some years, the overall value of the renovation is reduced. Previous work on the subject has identified and rated the relevance of the DFs to the technical requirements and developed a method to reveal the economic value of repair works. The future costs can be traded off against increased quality assurance during the construction process. The proposed framework describes the joint simulation of the technical importance and economic value of the on-site DFs of ETICS. The model provides new knowledge to improve resource allocation during the construction process by enabling the identification and mitigation of the most relevant degradation factors and increasing the economic value to the owner.

Keywords: ETICS, construction technology, construction management, life cycle costing

Procedia PDF Downloads 398
30379 Analyzing On-Line Process Data for Industrial Production Quality Control

Authors: Hyun-Woo Cho

Abstract:

The monitoring of industrial production quality has to be implemented to provide early warning of unusual operating conditions. Furthermore, identification of their assignable causes is necessary for quality control purposes. For such tasks, many multivariate statistical techniques have been applied and shown to be quite effective tools. This work presents a process data-based monitoring scheme for production processes. For more reliable results, additional steps of noise filtering and preprocessing are considered, which may lead to enhanced performance by eliminating unwanted variation in the data. The performance evaluation is executed using data sets from test processes. The proposed method is shown to provide reliable quality control results and is thus more effective for quality monitoring in the example. For practical implementation of the method, an on-line data system must be available to gather historical and on-line data. Recently, large amounts of data have been collected on-line in most processes, so implementation of the current scheme is feasible and does not place additional burdens on users.
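
The abstract does not name the multivariate technique used, so the sketch below assumes one common choice: PCA-based monitoring with Hotelling T² and SPE (Q) statistics fitted on historical in-control data and applied to new observations. The data and the empirical control limits are synthetic placeholders.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
normal = rng.normal(size=(500, 6))                        # historical in-control process data
new = rng.normal(size=(50, 6)) + [0, 0, 0, 2.5, 0, 0]     # new batch with one shifted variable

scaler = StandardScaler().fit(normal)
pca = PCA(n_components=3).fit(scaler.transform(normal))

def t2_spe(X):
    Z = scaler.transform(X)
    scores = pca.transform(Z)
    t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)   # Hotelling T^2
    resid = Z - pca.inverse_transform(scores)
    spe = np.sum(resid**2, axis=1)                             # squared prediction error (Q)
    return t2, spe

t2_ref, spe_ref = t2_spe(normal)
t2_new, spe_new = t2_spe(new)

# Simple empirical 99th-percentile control limits from the reference data
t2_lim, spe_lim = np.percentile(t2_ref, 99), np.percentile(spe_ref, 99)
alarms = np.sum((t2_new > t2_lim) | (spe_new > spe_lim))
print(f"T2 limit {t2_lim:.2f}, SPE limit {spe_lim:.2f}, alarms on new batch: {alarms}/{len(new)}")
```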

Keywords: detection, filtering, monitoring, process data

Procedia PDF Downloads 525
30378 Quality Management in Spice Paprika Production as a Synergy of Internal and External Quality Measures

Authors: É. Kónya, E. Szabó, I. Bata-Vidács, T. Deák, M. Ottucsák, N. Adányi, A. Székács

Abstract:

Spice paprika is a major spice commodity in the European Union (EU), produced locally and imported from non-EU countries, and reported not only for chemical and microbiological contamination but also for fraud. The effective interaction between producers' quality management practices and government and EU activities is described using the example of spice paprika production and control in Hungary, a leading spice paprika producer and per capita consumer in Europe. To demonstrate the importance of various contamination factors in the Hungarian production and EU trade of spice paprika, several aspects concerning the food safety of this commodity are presented. Alerts in the Rapid Alert System for Food and Feed (RASFF) of the EU between 2005 and 2013, as well as Hungarian state inspection results on spice paprika in 2004, are discussed, and quality non-compliance claims regarding spice paprika among EU member states are summarized by means of network analysis. Quality assurance measures established along the spice paprika production technology chain at the leading Hungarian spice paprika manufacturer, Kalocsai Fűszerpaprika Zrt., are surveyed, with the main critical control points identified. The structure and operation of the Hungarian state food safety inspection system is described. The concerted performance of the latter two quality management systems illustrates the effective interaction between internal (manufacturer) and external (state) quality control measures.

Keywords: spice paprika, quality control, reporting mechanisms, RASFF, vulnerable points, HACCP

Procedia PDF Downloads 262
30377 Development of Sleep Quality Index Using Heart Rate

Authors: Dongjoo Kim, Chang-Sik Son, Won-Seok Kang

Abstract:

Adequate sleep affects various parts of one's overall physical and mental life. As one method of determining the appropriate amount of sleep, this research presents a heart rate-based sleep quality index. In order to evaluate sleep quality using the heart rate, sleep data from 280 subjects collected over one month are used. Their sleep data are categorized by a three-part heart rate range. After categorizing, some features are extracted, and their statistical significance is verified. The results show that some features of this sleep quality index model are statistically significant. Thus, this heart rate-based sleep quality index may be a useful discriminator of sleep.
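
The paper does not publish its heart rate cut-offs or feature list, so the sketch below only illustrates the general recipe: bin per-minute heart rate into three ranges, derive the share of the night spent in each range as candidate features, and test one feature statistically. All values are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# One night of per-minute heart rate (bpm) for two hypothetical subjects
good_sleeper = rng.normal(55, 4, 420)
poor_sleeper = rng.normal(64, 7, 420)

# Hypothetical three-part heart rate ranges; the paper's actual cut-offs are not given
bins = [0, 60, 75, np.inf]

def range_features(hr):
    counts, _ = np.histogram(hr, bins=bins)
    share = counts / counts.sum()            # fraction of the night spent in each range
    return {"low": share[0], "mid": share[1], "high": share[2], "mean_hr": hr.mean()}

print("good:", range_features(good_sleeper))
print("poor:", range_features(poor_sleeper))

# Statistical check of one candidate feature between the two nights
t, p = stats.ttest_ind(good_sleeper, poor_sleeper, equal_var=False)
print(f"mean heart rate difference: t = {t:.2f}, p = {p:.3g}")
```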

Keywords: sleep, sleep quality, heart rate, statistical analysis

Procedia PDF Downloads 313
30376 Impact of Graduates’ Quality of Education and Research on ICT Adoption at Workplace

Authors: Mohammed Kafaji

Abstract:

This paper aims to investigate the influence of the quality of education and the quality of research, provided by local educational institutions, on the adoption of Information and Communication Technology (ICT) in managing business operations for companies in the Saudi market. A model was developed and tested using data collected from 138 CEOs of foreign companies in diverse business sectors. The data are analysed and managed using multivariate approaches through standard statistical packages. The results showed that educational quality contributes little to ICT adoption, while research quality seems to play a more prominent role. These results are analysed in terms of business environment and market constraints and further extended to the perceived effectiveness of the applied pedagogical approaches in schools and universities.

Keywords: quality of education, quality of research, mediation, domestic competition, ICT adoption

Procedia PDF Downloads 432
30375 Data Quality on Regular Immunization Programme at Birkod District: Somali Region, Ethiopia

Authors: Eyob Seife, Tesfalem Teshome, Bereket Seyoum, Behailu Getachew, Yohans Demis

Abstract:

Developing countries continue to face preventable communicable diseases, such as vaccine-preventable diseases. The Expanded Programme on Immunization (EPI) was established by the World Health Organization in 1974 to control these diseases. Health data use is crucial in decision-making, but ensuring data quality remains challenging. The study aimed to assess the accuracy ratio, timeliness, and quality index of regular immunization programme data in the Birkod district of the Somali Region, Ethiopia. Technical, contextual, behavioral, and organizational factors are among the contributors to poor data quality. The study used a quantitative cross-sectional design conducted in September 2022 GC using WHO-recommended data quality self-assessment tools. The accuracy ratio and timeliness of reports on regular immunization programmes were assessed for two health centers and three health posts in the district for one fiscal year. Moreover, the quality index assessment was conducted at the district level and at the health facilities by trained assessors. The study found poor data quality in the accuracy ratio and timeliness of reports at all health units, including zero reports. Overreporting was observed for most facilities, particularly at the health post level. Health centers showed a relatively better accuracy ratio than health posts. The quality index assessment revealed poor quality at all levels. The study recommends that responsible bodies at different levels improve data quality using various approaches, such as building the capacity of health professionals and strengthening the quality index components. The study highlights the need for attention to data quality in general, specifically at the health post level, and for improving the quality index at all levels, which is essential.

Keywords: Birkod District, data quality, quality index, regular immunization programme, Somali Region-Ethiopia

Procedia PDF Downloads 50
30374 A New Approach – A Numerical Assessment of Ground Strata Failure Potentials in Underground Mines

Authors: Omer Yeni

Abstract:

Ground strata failure, or fall of ground, is one of the most prominent catastrophic risks in underground mines. Mining companies use various methods and techniques to prevent and critically control the associated risks. Some of these are safety by design, excavation methods, ground support, training, and competency, all of which require quality control and assurance activities to confirm their efficiency and performance and to identify improvement opportunities through monitoring. However, many mining companies use quality control (QC) methods without quality assurance (QA) and call it QA/QC together out of habit. By a simple definition, QC is a method of detecting defects, and QA is a method of preventing defects. Proper QA/QC application means testing every component before assembly and the final product once completed, not only testing the final products at the end of the production line. The installed ground support elements are among the final products mining companies use to prevent ground strata failure. Testing the final product (i.e., rock bolt pull testing, shotcrete strength tests, etc.) with QC methods only once those areas are already accessible is no different from testing an airplane full of passengers straight off the production line or testing a car after the sale. Can QC methods alone be called QA/QC? Can QA/QC activities be numerically scored for each critical control implemented to assess ground strata failure potential? Can numerical scores be used to derive a Geotechnical Risk Rating (GRR) to determine the ground strata failure risk and its probability? This paper sets out to provide a specific QA/QC methodology to manage and confirm the efficiency and performance of the implemented critical controls, and a numerical approach, the Geotechnical Risk Rating (GRR) process, to assess ground strata failure risk, determine the gaps where proactive action is required, and evaluate the probability of ground strata failures in underground mines.
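
The idea of turning QA/QC results into a numerical rating can be sketched very simply: score each critical control, weight it, and band the resulting gap into a rating. The controls, weights, scores, and rating bands below are invented for illustration and are not the author's actual GRR scheme.

```python
# Hypothetical scoring of QA/QC results per critical control; weights, scores, and
# rating bands are placeholders used only to illustrate a numerical GRR.
controls = {
    # control: (weight, QA/QC score 0-100)
    "safety by design":    (0.25, 85),
    "excavation method":   (0.20, 70),
    "ground support":      (0.35, 60),
    "training/competency": (0.20, 90),
}

weighted = sum(w * s for w, s in controls.values())
gap = 100 - weighted  # shortfall against fully effective critical controls

if gap < 15:
    rating = "low"
elif gap < 30:
    rating = "moderate"
else:
    rating = "high"

print(f"weighted control effectiveness: {weighted:.1f}/100")
print(f"geotechnical risk rating (GRR): {rating} (gap {gap:.1f})")
```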

Keywords: fall of ground, ground strata failure, QA/QC, underground

Procedia PDF Downloads 48
30373 Impact of Safety and Quality Considerations of Housing Clients on the Construction Firms’ Intention to Adopt Quality Function Deployment: A Case of Construction Sector

Authors: Saif Ul Haq

Abstract:

The current study examines the safety and quality considerations of clients of housing projects and their impact on the adoption of Quality Function Deployment (QFD) by construction firms. A mixed-method research technique was used to collect and analyze the data, wherein a survey was conducted to collect data from 220 clients of housing projects in Saudi Arabia. Telephone and Skype interviews were then conducted to collect data from 15 professionals working in the top ten real estate companies of Saudi Arabia. Data were analyzed using partial least squares (PLS) and thematic analysis techniques. Findings reveal that today's customers prioritize the safety and quality requirements of their houses and, as a result, construction firms adopt QFD to address the needs of customers. The findings are of great importance for the clients of housing projects as well as for construction firms, as they could apply QFD in housing projects to address the safety and quality concerns of their clients.

Keywords: construction industry, quality considerations, quality function deployment, safety considerations

Procedia PDF Downloads 101