Search results for: process evaluation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20018

18818 Behaviour of Reinforced Concrete Infilled Frames under Seismic Loads

Authors: W. Badla

Abstract:

A significant portion of the buildings constructed in Algeria consists of structural frames with infill panels, which are usually considered non-structural components and neglected in the analysis. However, these masonry panels tend to influence the structural response, so such structures can be regarded as seismic-risk buildings, although the Algerian seismic code offers little guidance on the seismic evaluation of infilled frame buildings. In this study, three RC frames with 2, 4, and 8 stories, subjected to three recorded Algerian accelerograms, are studied. The diagonal strut approach is adopted for modeling the infill panels, and a fiber model is used to model the RC members. This paper reports on the seismic evaluation of RC frames with brick infill panels. The results show that the masonry panels enhance the lateral load capacity of the buildings and that the infill panel configuration influences the response of the structures.
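The diagonal-strut modeling mentioned above is often implemented with an empirical strut-width estimate. As an illustration only (the abstract does not state which relation was used), a common choice is Mainstone's formula; the function below is a sketch under that assumption:

```python
import math

def mainstone_strut_width(E_m, t, theta, E_c, I_c, h_m, H, d):
    """Equivalent diagonal-strut width per Mainstone's empirical relation.

    E_m: masonry elastic modulus, t: panel thickness, theta: strut angle (rad),
    E_c, I_c: column modulus and moment of inertia, h_m: infill panel height,
    H: column height, d: panel diagonal length (consistent units throughout).
    """
    # Relative panel-to-frame stiffness parameter lambda_1
    lam = (E_m * t * math.sin(2 * theta) / (4 * E_c * I_c * h_m)) ** 0.25
    # Strut width w = 0.175 * (lambda_1 * H)^-0.4 * d
    return 0.175 * (lam * H) ** -0.4 * d
```

For typical panel and column properties the strut width comes out at roughly a tenth to a quarter of the diagonal length, which is then used as the width of the equivalent compression strut in the frame model.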

Keywords: seismic design, RC frames, infill panels, nonlinear dynamic analysis

Procedia PDF Downloads 541
18817 Modified Fuzzy Delphi Method to Incorporate Healthcare Stakeholders’ Perspectives in Selecting Quality Improvement Projects’ Criteria

Authors: Alia Aldarmaki, Ahmad Elshennawy

Abstract:

There is a global shift in healthcare systems toward engaging different stakeholders in selecting quality improvement initiatives and incorporating their preferences to improve healthcare efficiency and outcomes. Although experts bring scientific knowledge based on the scientific model and their personal experience, other stakeholders can bring new insights and information into the decision-making process. This study explores the impact of incorporating different stakeholders' preferences when identifying the most significant criteria that should be considered in healthcare for selecting improvement projects. A framework based on a modified Fuzzy Delphi Method (FDM) was built. In addition to the subject-matter experts, groups of doctors/physicians, nurses, administrators, and managers contribute to the selection process. The research identifies potential criteria for evaluating projects in healthcare and then utilizes FDM to capture expert knowledge. The first round of FDM is intended to validate the identified list of criteria with the experts, which includes collecting additional criteria that the literature might have overlooked. When an acceptable level of consensus has been reached, a second round is conducted to obtain the experts' and other related stakeholders' opinions on the appropriate weight of each criterion's importance using linguistic variables. The FDM analysis eliminates or retains criteria to produce a final list of the critical criteria for selecting improvement projects in healthcare. Finally, reliability and validity were investigated using Cronbach's alpha and factor analysis, respectively. Two case studies were carried out in a public hospital in the United Arab Emirates to test the framework. Both cases demonstrate that even though there were common criteria between the experts and the stakeholders, the stakeholders' perceptions brought additional critical criteria into the evaluation process, which can impact the outcomes. Experts selected criteria related to strategic and managerial aspects, while the other participants preferred criteria related to social aspects such as health and safety and patient satisfaction. The health and safety criterion had the highest importance weight in both cases. The analysis showed that the Cronbach's alpha value is 0.977 and that all criteria have factor loadings greater than 0.3. In conclusion, the inclusion of stakeholders' perspectives is intended to enhance stakeholder engagement, improve transparency throughout the decision process, and support robust decisions.
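A common FDM screening step converts each linguistic rating into a triangular fuzzy number, aggregates across raters, and defuzzifies to decide whether a criterion is retained. The sketch below is illustrative only: the min/geometric-mean/max aggregation and the 0.7 threshold are conventional choices, not values stated in the abstract.

```python
import math

def fuzzy_delphi_screen(ratings, threshold=0.7):
    """ratings: {criterion: [(l, m, u), ...]}, one triangular fuzzy number per rater.

    Aggregates per criterion as (min of l, geometric mean of m, max of u),
    defuzzifies with the simple average (l + m + u) / 3, and retains criteria
    whose score meets the consensus threshold.
    """
    scores = {}
    for crit, tfns in ratings.items():
        l = min(t[0] for t in tfns)
        m = math.prod(t[1] for t in tfns) ** (1 / len(tfns))
        u = max(t[2] for t in tfns)
        scores[crit] = (l + m + u) / 3
    retained = {c for c, s in scores.items() if s >= threshold}
    return scores, retained
```

Criteria whose defuzzified score falls below the threshold are dropped before the second, weighting round.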

Keywords: Fuzzy Delphi Method, fuzzy number, healthcare, stakeholders

Procedia PDF Downloads 121
18816 Content and Language Integrated Learning: English and Art History

Authors: Craig Mertens

Abstract:

Teaching art history or any other academic subject to EFL students can be done successfully. A course called Western Images was created to teach Japanese students art history using only English in the classroom. An approach known as Content and Language Integrated Learning (CLIL) was used as the basis for this course. This paper's purpose is to state the reasons why learning about art history is important, go through the process of creating content for the course, and suggest multiple tasks to help students practice the critical thinking skills used in analyzing and drawing conclusions about works of art from Western culture. As a guide for this paper, Brown's (1995) six elements of a language curriculum will be used: needs analysis, goals and objectives, assessment, materials, teaching method and tasks, and evaluation of the course. The goal here is to inspire debate and discussion regarding CLIL and its pros and cons, and to question current curricula in university language courses.

Keywords: art history, EFL, content and language integrated learning, critical thinking

Procedia PDF Downloads 594
18815 Evaluation of Mechanical Properties and Surface Roughness of Nanofilled and Microhybrid Composites

Authors: Solmaz Eskandarion, Haniyeh Eftekhar, Amin Fallahi

Abstract:

Introduction: Nowadays, cosmetic dentistry has gained greater attention because of the changing demands of dental patients. Composite resin restorations play an important role in the field of esthetic restorations. Because resin composites vary, it is important to be aware of their mechanical properties and surface roughness. The aim of this study was therefore to compare the mechanical properties (surface hardness, compressive strength, diametral tensile strength) and surface roughness of four resin composites after a thermal aging process. Materials and Method: Ten samples of each composite resin (Gradia Direct (GC), Filtek Z250 (3M), G-ænial (GC), and Filtek Z350 (3M, Filtek Supreme)) were prepared for the evaluation of each property (120 samples in total). Thermocycling (10,000 cycles between 5 °C and 55 °C) was applied. The samples were then tested for compressive strength and diametral tensile strength using a universal testing machine (UTM), and surface hardness was evaluated with a microhardness testing machine. Surface roughness was evaluated with a scanning electron microscope after surface polishing. Results: Filtek Z250 showed the highest compressive strength (CS), but there were no significant differences in CS between the four groups. Filtek Z250 also showed the highest diametral tensile strength (DTS), followed, from highest to lowest, by Filtek Z350, G-ænial, and Gradia Direct; for DTS, all groups showed significant differences (P<0.05). The Vickers hardness number (VHN) of Filtek Z250 was the greatest, followed by Filtek Z350, G-ænial, and Gradia Direct. The surface roughness of the nanofilled composites was less than that of the microhybrid composites, although the surface roughness of G-ænial was slightly greater than that of Filtek Z250. Conclusion: This study indicates that there is no evident significant difference among the groups in their mechanical properties, although Filtek Z250 showed slightly better mechanical properties. Regarding surface roughness, nanofilled composites performed better than microhybrid composites.
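The strength values reported above come from standard specimen tests. As a hedged illustration (the specimen geometry below is hypothetical, not the study's), the usual formulas for diametral tensile strength and compressive strength are:

```python
import math

def diametral_tensile_strength(load, diameter, thickness):
    """DTS for a disc loaded across its diameter: 2P / (pi * D * t).

    load in N, diameter and thickness in mm -> result in MPa.
    """
    return 2 * load / (math.pi * diameter * thickness)

def compressive_strength(load, diameter):
    """CS for a cylinder loaded axially: P / (pi * D^2 / 4), same units."""
    return load / (math.pi * diameter ** 2 / 4)
```

With the failure load from the UTM, each specimen's strength follows directly from its measured dimensions.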

Keywords: mechanical properties, surface roughness, resin composite, compressive strength, thermal aging

Procedia PDF Downloads 349
18814 Evaluation of a Mindfulness and Self-Care-Based Intervention for Teachers to Enhance Mental Health

Authors: T. Noichl, M. Cramer, G. E. Dlugosch, I. Hosenfeld

Abstract:

Teachers are exposed to a variety of stresses in their work context, which can have a negative impact on physical and psychological well-being. The online training 'Better Living! Self-care for teachers' is based on the training 'Better Living! Self-care for mental health professionals', which has been proven to be effective over a period of three years. The training for teachers is being evaluated for its effectiveness between October 2021 and March 2023 in a study funded by the German Federal Ministry of Education and Research. The aim of the training is to promote self-care and mindfulness among participants and thereby foster well-being. The concept of self-care was already mentioned in antiquity and was named as an imperative by philosophers such as Socrates and Epictetus. In the absence of a universal understanding of self-care today, the following definition was developed within the research group: self-care is 1) facing oneself in a loving and appreciative way, 2) taking one's own needs seriously, and 3) actively contributing to one's own well-being. The study is designed as a randomized wait-control-group repeated-measures design with four measurement points for the treatment group and six for the wait-control group. The central dependent variables are self-care, mindfulness, stress, and well-being. To assess the long-term effectiveness of training participation, these constructs are surveyed at the beginning and end of the training as well as five weeks and one year later. Based on the results of the evaluation with mental health professionals, participation is expected to lead to an increase in subjective well-being, self-care, and mindfulness. The first results of the evaluation study are presented and discussed with regard to the effectiveness of the training among teachers.

Keywords: longitudinal intervention study, mindfulness, self-care, teachers’ mental health, well-being

Procedia PDF Downloads 91
18813 Spatial Point Process Analysis of Dengue Fever in Tainan, Taiwan

Authors: Ya-Mei Chang

Abstract:

This research applies spatio-temporal point process methods to dengue fever data from Tainan. The spatio-temporal intensity function of the dataset is assumed to be separable. Kernel estimation, a widely used approach for estimating intensity functions, is employed; the intensity function is very helpful for studying the relation between the spatio-temporal point process and covariates. Because the covariate effects might be nonlinear, a nonparametric smoothing estimator is used to detect nonlinearity in the covariate effects. A fitted parametric model can then describe the influence of the covariates on dengue fever. Correlation between the data points is detected by the K-function. The results of this research could provide useful information to help the government or other stakeholders make decisions.
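The kernel estimation step can be sketched as follows; this is a minimal illustration with an isotropic Gaussian kernel and no edge correction, both simplifying assumptions relative to what a full analysis would use:

```python
import math

def kernel_intensity(points, x, y, bandwidth):
    """Gaussian kernel estimate of the spatial intensity at location (x, y).

    points: list of (px, py) event locations; bandwidth: kernel std. deviation.
    Each event contributes a bivariate Gaussian bump centered at its location.
    """
    h2 = bandwidth ** 2
    return sum(
        math.exp(-((x - px) ** 2 + (y - py) ** 2) / (2 * h2)) / (2 * math.pi * h2)
        for px, py in points
    )
```

In practice one would also apply an edge correction near the study-region boundary and choose the bandwidth by cross-validation.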

Keywords: dengue fever, spatial point process, kernel estimation, covariate effect

Procedia PDF Downloads 344
18812 Evaluation of the Skid Resistance of Asphalt Concrete Made of Local Low-Performance Aggregates Based on New Accelerated Polishing Machine

Authors: Saci Abdelhakim Ferkous, Khedoudja Soudani, Smail Haddadi

Abstract:

This paper presents the results of a laboratory experimental study that explores the skid resistance of asphalt concrete mixtures made of local low-performance aggregates by partially replacing sand with olive mill waste (OMW). OMW was mixed with the aggregates using a dry process, replacing sand at contents of 5%, 7%, 10%, and 15%. The mechanical performance of the mixtures was evaluated using the Marshall and Duriez tests. A modified accelerated polishing machine was used as the polishing equipment, and a British pendulum tester (BPT) was used to test the skid resistance of the samples. Finally, texture parameter analysis was performed using scanning electron microscopy (SEM) and MountainsMap software to assess the effect of OMW on the evolution of the friction coefficient. Using a distinct road wheel for a modified version of an accelerated polishing machine, which is normally used to determine the polished stone value of aggregates, the results showed that the addition of OMW up to 10% conferred better skid resistance than normal asphalt concrete. The presence of olive mill waste in the mixture at contents up to 15% yields a gain of 22%-29% in skid resistance after polishing compared with the reference mix. Indeed, the texture parameter analysis showed differential wear of the lightweight aggregates (OMW) compared to the other aggregates during the polishing process, which created a new surface microtexture with new peaks and led to a good level of friction compared to the mixtures without OMW. In general, OMW was found to be a promising modifier for asphalt mixtures, with both engineering and economic merits.

Keywords: skid resistance, olive mill waste, polishing resistance, accelerated polishing machine, local materials, sustainable development

Procedia PDF Downloads 44
18811 Simplified Analysis Procedure for Seismic Evaluation of Tall Building at Structure and Component Level

Authors: Tahir Mehmood, Pennung Warnitchai

Abstract:

Simplified static analysis procedures such as the Nonlinear Static Procedure (NSP) are gaining popularity for the seismic evaluation of buildings. However, these simplified procedures account only for the seismic response of the fundamental vibration mode of the structure. Other procedures that can take higher modes of vibration into account lack the accuracy to determine component responses. Hence, such procedures are not suitable for evaluating structures in which many vibration modes may participate significantly or in which component responses need to be evaluated. Moreover, these procedures were found to be either computationally expensive or tedious when individual component responses are required. In this paper, a simplified but accurate procedure, the Uncoupled Modal Response History Analysis (UMRHA) procedure, is studied. In this procedure, the nonlinear response of each vibration mode is first computed, and the modal responses are then combined into the total response of the structure. The responses of four tall buildings are computed by this simplified UMRHA procedure and compared with those obtained from the Nonlinear Response History Analysis (NLRHA) procedure. The comparison shows that the UMRHA procedure accurately computes the global responses, i.e., story shears, story overturning moments, floor accelerations, and inter-story drifts, as well as the component-level responses of these tall buildings, whose heights vary from 20 to 44 stories. The required computational effort is also extremely low compared to that of the NLRHA procedure.
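The core idea of UMRHA, computing each mode's nonlinear response history separately and then summing the contributions, can be sketched in a few lines. This illustrates only the combination step, not the per-mode modal pushover analyses that precede it:

```python
def umrha_total_response(modal_histories):
    """Combine per-mode response histories into the total response history.

    modal_histories: list of equal-length lists, one response history per
    vibration mode. UMRHA sums the modal contributions at each time step.
    """
    return [sum(step) for step in zip(*modal_histories)]
```

Each entry of the result is the structure's total response (e.g., a story shear) at one time step, obtained from the independently computed modal histories.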

Keywords: higher mode effects, seismic evaluation procedure, tall buildings, component responses

Procedia PDF Downloads 341
18810 Comparative Evaluation of Pharmacologically Guided Approaches (PGA) to Determine Maximum Recommended Starting Dose (MRSD) of Monoclonal Antibodies for First Clinical Trial

Authors: Ibraheem Husain, Abul Kalam Najmi, Karishma Chester

Abstract:

First-in-human (FIH) studies are a critical step in the clinical development of any molecule that has shown therapeutic promise in preclinical evaluations, since the translation of preclinical research and safety studies into clinical development is crucial to the successful development of monoclonal antibodies for the treatment of human diseases. Therefore, the USFDA approach was compared with nine pharmacologically guided approaches (PGA) (simple allometry, maximum life-span potential, brain weight, rule of exponents (ROE), two-species methods, and one-species methods) for determining the maximum recommended starting dose (MRSD) for first-in-human clinical trials, using four drugs: Denosumab, Bevacizumab, Anakinra, and Omalizumab. In our study, the predicted pharmacokinetic (PK) parameters and the estimated first-in-human doses of the antibodies were compared with the observed human values. The study indicated that the clearance and volume of distribution of antibodies can be predicted with reasonable accuracy in humans, and a good estimate of the first human dose can be obtained from the predicted human clearance and volume of distribution. A pictorial method-evaluation chart based on fold errors was also developed for the simultaneous evaluation of the various methods.
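Two of the pharmacologically guided approaches can be sketched as follows. The 0.75 and 0.33 exponents and the 10-fold safety factor are conventional defaults, assumed here for illustration; the paper may have used different values:

```python
def allometric_clearance(cl_animal, w_animal, w_human=70.0, exponent=0.75):
    """Simple allometry: predict human clearance as CL_h = CL_a * (W_h / W_a)^b."""
    return cl_animal * (w_human / w_animal) ** exponent

def mrsd_from_animal_dose(dose_mg_kg, w_animal, w_human=70.0, safety_factor=10.0):
    """Scale an animal dose (mg/kg) to a human equivalent dose via body-surface-area
    scaling (exponent ~0.33 on the weight ratio), then divide by a safety factor."""
    hed = dose_mg_kg * (w_animal / w_human) ** 0.33
    return hed / safety_factor
```

For example, a 10 mg/kg dose in a 0.25 kg rat scales to a human equivalent dose of about 1.56 mg/kg, giving an MRSD of about 0.16 mg/kg after the safety factor.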

Keywords: clinical pharmacology (CPH), clinical research (CRE), clinical trials (CTR), maximum recommended starting dose (MRSD), clearance and volume of distribution

Procedia PDF Downloads 368
18809 Open Science Philosophy, Research and Innovation

Authors: C. Ardil

Abstract:

Open Science concerns the understanding and application of various theories and practices in open science philosophy, systems, paradigms, and epistemology. It originates from the premise that universal scientific knowledge is the product of collective scholarly and social collaboration involving all stakeholders, and that knowledge belongs to the global society: scientific outputs generated by public research are a public good that should be available to all at no cost and without barriers or restrictions. Open Science has the potential to increase the quality, impact, and benefits of science; to accelerate the advancement of knowledge by making it more reliable, efficient, and accurate, better understandable by society, and responsive to societal challenges; and to enable growth and innovation through the reuse of scientific results by all stakeholders at all levels of society, ultimately contributing to the growth and competitiveness of the global society. In its broadest definition, Open Science is a global movement to improve the accessibility and reusability of research practices and outputs, encompassing open access to publications, open research data and methods, open source, open educational resources, open evaluation, and citizen science. Its implementation provides an excellent opportunity to renegotiate the social roles and responsibilities of publicly funded research and to rethink the science system as a whole.

In practice, Open Science means conducting science so that others can collaborate and contribute: research data, lab notes, and other research processes are made freely available under terms that enable the reuse, redistribution, and reproduction of the research and its underlying data and methods. This represents a systemic shift from the standard practice of publishing research results only in scientific publications toward sharing and using all available knowledge at an earlier stage in the research process, extending the principles of openness to the whole research cycle and fostering sharing and collaboration as early as possible. The recognition and adoption of open science practices, including policies that increase open access to the scientific literature and encourage data and code sharing, is increasing. Such policies are motivated by ethical, moral, or utilitarian arguments, such as the right to access the digital research literature, transparency in academic practice, and reproducibility. Researchers also adopt open science practices to their own advantage: to increase citations and to attract media attention, potential collaborators, career opportunities, donations, and funding. Evidence from open data suggests that open science practices provide significant benefits to researchers in research creation, collaboration, communication, and evaluation compared with more traditional closed practices. At the same time, Open Science raises concerns such as the rigor of peer review, funding and career development, and the sacrifice of author rights; researchers are therefore recommended to implement open science research within the framework of existing academic evaluation and incentives. Accordingly, open science research issues are addressed in the areas of publishing, funding, collaboration, resource management and sharing, and career development.

Keywords: Open Science, Open Science Philosophy, Open Science Research, Open Science Data

Procedia PDF Downloads 125
18808 Evaluation of Vehicle Classification Categories: Florida Case Study

Authors: Ren Moses, Jaqueline Masaki

Abstract:

This paper addresses the need for an accurate and updated vehicle classification system through a thorough evaluation of vehicle class categories, identifying errors arising from the existing system and proposing modifications. Data collected from two permanent traffic monitoring sites in Florida were used to evaluate the performance of the existing vehicle classification table. The vehicle data were collected and classified by an automatic vehicle classifier (AVC), and a video camera was used to obtain ground truth data. The Federal Highway Administration (FHWA) vehicle classification definitions were used to define vehicle classes from the video and compare them with the data generated by the AVC in order to identify the sources of misclassification. Six types of errors were identified, and modifications were made to the classification table to improve classification accuracy. The results of this study include an updated vehicle classification table that reduces the total error by 5.1%, a step-by-step procedure for evaluating vehicle classification studies, and recommendations to improve the FHWA 13-category rule set. The recommendations indicate that the vehicle classification definitions in this scheme need to be updated to reflect the distribution of current traffic. The presented results will be of interest to state transportation departments and to consultants, researchers, engineers, designers, and planners who require accurate vehicle classification information for the planning, design, and maintenance of transportation infrastructure.
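The comparison of AVC output against video ground truth reduces to tallying a confusion matrix. A minimal sketch follows; the class labels and counts in the example are hypothetical, not the study's data:

```python
def classification_error_rates(confusion):
    """confusion: {(true_class, assigned_class): count} from ground truth vs. AVC.

    Returns the overall error rate and a per-true-class error rate.
    """
    total = sum(confusion.values())
    overall = sum(n for (t, a), n in confusion.items() if t != a) / total
    per_class = {}
    for cls in {t for (t, _) in confusion}:
        cls_total = sum(n for (t, _), n in confusion.items() if t == cls)
        cls_wrong = sum(n for (t, a), n in confusion.items() if t == cls and a != t)
        per_class[cls] = cls_wrong / cls_total
    return overall, per_class
```

The per-class rates point to which FHWA categories drive the misclassification, which is where rule-set modifications pay off most.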

Keywords: vehicle classification, traffic monitoring, pavement design, highway traffic

Procedia PDF Downloads 175
18807 Defining Priority Areas for Biodiversity Conservation to Support for Zoning Protected Areas: A Case Study from Vietnam

Authors: Xuan Dinh Vu, Elmar Csaplovics

Abstract:

There has been an increasing need for methods to define priority areas for biodiversity conservation, since the effectiveness of biodiversity conservation in protected areas largely depends on the availability of material resources. The identification of priority areas requires the integration of biodiversity data with social data on human pressures and responses. However, the deficit of comprehensive data and reliable methods is a key challenge in zoning where the demand for conservation is most urgent and where the outcomes of conservation strategies can be maximized. To fill this gap, the study applied the Condition–Pressure–Response environmental model to suggest a set of criteria for identifying priority areas for biodiversity conservation. Our empirical data were compiled from 185 respondents in three main groups in Vietnam: governmental administration, research institutions, and protected areas, using a well-designed questionnaire. Analytic Hierarchy Process (AHP) theory was then used to identify the weight of each criterion. Our results show that the priority level for biodiversity conservation can be identified by three main indicators: condition, pressure, and response, with weights of 26%, 41%, and 33%, respectively. Based on these three indicators, 7 criteria and 15 sub-criteria were developed to support the definition of priority areas for biodiversity conservation and the zoning of protected areas. In addition, our study revealed that the governmental administration and protected area groups focused on the 'Pressure' indicator, while the research institution group emphasized the importance of the 'Response' indicator in the evaluation process. Our results provide recommendations for applying the developed criteria to identify priority areas for biodiversity conservation in Vietnam.
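The AHP weighting step can be approximated with the row geometric-mean method, a standard alternative to the principal-eigenvector computation. This is a sketch: the pairwise comparison matrix in the example is hypothetical, not the respondents' judgments.

```python
import math

def ahp_weights(pairwise):
    """Approximate the AHP priority vector of a reciprocal pairwise-comparison
    matrix using the row geometric-mean method."""
    n = len(pairwise)
    gms = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gms)
    return [g / total for g in gms]
```

Each row's geometric mean is normalized so the weights sum to one; a consistency check (e.g., Saaty's consistency ratio) would normally accompany this in a full analysis.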

Keywords: biodiversity conservation, condition–pressure–response model, criteria, priority areas, protected areas

Procedia PDF Downloads 161
18806 Organic Matter Removal in Urban and Agroindustry Wastewater by Chemical Precipitation Process

Authors: Karina Santos Silvério, Fátima Carvalho, Maria Adelaide Almeida

Abstract:

The impacts caused by anthropogenic actions on the water environment have been one of the main challenges of modern society. Population growth, together with water scarcity and climate change, points to the need to increase the resilience of production systems and to improve the management of the wastewater generated in different processes. In this context, the study developed under the NETA project (New Strategies in Wastewater Treatment) aimed to evaluate the efficiency of the Chemical Precipitation Process (CPP), using hydrated lime (Ca(OH)₂) as a reagent, in wastewater from the agroindustry sector, namely swine and slaughterhouse wastewater, as well as urban wastewater, in order to make the production systems fully circular, with a direct positive impact on the environment. The purpose of CPP is to innovate in the field of effluent treatment technologies, as it allows rapid application and is economically profitable. In summary, the study was divided into four main stages: 1) application of the reagent in a single step, raising the pH to 12.5; 2) obtaining sludge and treated effluent; 3) natural neutralization of the effluent through carbonation using atmospheric CO₂; and 4) characterization and evaluation of the feasibility of the chemical precipitation technique in the treatment of the different wastewaters, using the chemical oxygen demand (COD) determination and other supporting physical-chemical parameters. The results showed average removal efficiencies above 80% for all effluents, with the swine effluent showing the highest removal (90%), followed by the urban effluent (88%) and the slaughterhouse effluent (81%) on average. Significant improvements in color and odor removal were also obtained after carbonation to pH 8.00.
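The removal efficiencies quoted above follow from the standard formula relating influent and effluent COD. A trivial sketch (the COD values in the example are illustrative, not the study's measurements):

```python
def removal_efficiency(cod_in, cod_out):
    """Percent removal of an organic-load parameter: (COD_in - COD_out) / COD_in * 100."""
    return (cod_in - cod_out) / cod_in * 100.0
```

The same formula applies to any influent/effluent parameter pair, e.g., turbidity or total suspended solids.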

Keywords: agroindustry wastewater, urban wastewater, natural carbonatation, chemical precipitation technique

Procedia PDF Downloads 73
18805 The Evaluation of Apricot (Prunus armeniaca L.) Materials Collected from Southeast Anatolia Region of Turkey

Authors: M. Kubilay Önal

Abstract:

The objective of this study was to determine the adaptability of native apricot materials collected from the Southeast Anatolia region of Turkey to Aegean Region conditions. Different phenological and pomological characteristics of the cultivars were observed during the study. Promising types for adaptation trials were determined using the 'weighted-ranking' method: relative points were given to characteristics such as yield, average fruit weight, attractiveness, soluble solids, seed ratio by weight, and aroma. As a result of the two-year evaluation of the phenological and pomological characteristics of 22 types, 9 of them, viz. nos. 2235, 2236, 2237, 2239, 2242, 2244, 2246, 2249, and 2257, were selected as promising.
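The weighted-ranking selection can be sketched as a weighted sum of trait scores. The weights and point values below are hypothetical; the study's actual point allocations are not given in the abstract:

```python
def weighted_rank_score(trait_scores, trait_weights):
    """Composite score for one genotype: sum over traits of score * weight.

    trait_scores: {trait: points for this genotype}
    trait_weights: {trait: relative importance weight}
    """
    return sum(trait_scores[t] * trait_weights[t] for t in trait_weights)
```

Genotypes are then ranked by composite score, and the top-ranked ones are carried forward to adaptation trials.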

Keywords: apricot, phenological characters, pomological characters, weighted-ranking method

Procedia PDF Downloads 275
18804 Lean Manufacturing Implementation in Fused Plastic Bags Industry

Authors: Tareq Issa

Abstract:

Lean manufacturing is concerned with the implementation of several tools and methodologies that aim at the continuous elimination of waste throughout the manufacturing process flow in the production system. This research addresses the implementation of lean principles and tools in a small-to-medium industry, focusing on a 'fused' plastic bag production company in Amman, Jordan. In this production operation, the major types of waste to eliminate include material, waiting and transportation, and setup wastes. The primary goal is to identify and implement selected lean strategies to eliminate waste in the manufacturing process flow. A systematic approach was used for the implementation of lean principles and techniques through Value Stream Mapping (VSM) analysis. The current-state value stream map was constructed to improve the plastic bag manufacturing process by identifying opportunities to eliminate waste and its sources. A future-state value stream map was then developed, describing the improvements in the overall manufacturing process resulting from eliminating the wastes. The implementation of VSM, 5S, Kanban, Kaizen, and reduced lot sizes provided significant benefits: productivity increased to 95.4%, delivery schedule attainment reached 99-100%, total inventory was reduced to 1.4 days, and the setup time for the melting process was reduced to about 30 minutes.
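Two standard lean metrics that underpin value stream mapping can be sketched as follows; the example numbers are illustrative, not the company's figures:

```python
def process_cycle_efficiency(value_added_time, lead_time):
    """PCE = value-added time / total lead time (same time units); the VSM
    improvement target is to raise this ratio by removing non-value-added time."""
    return value_added_time / lead_time

def takt_time(available_time, demand_units):
    """Takt time = net available production time / customer demand, i.e., the
    pace at which units must be completed to meet demand."""
    return available_time / demand_units
```

Comparing PCE between the current-state and future-state maps quantifies how much waste the lean interventions removed.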

Keywords: lean implementation, plastic bags industry, value stream map, process flow

Procedia PDF Downloads 170
18803 The Use of Smart Power Concepts in the Military Targeting Process

Authors: Serdal AKYUZ

Abstract:

Smart power is the use of soft and hard power together in consideration of existing circumstances. Soft power can be defined as the capability of changing the perceptions of a target audience by employing policies based on legitimacy. Hard power generally uses military and economic instruments, which are the concrete indicators of the general comprehension of power. More than providing a balance between soft and hard power, smart power creates a proactive combination by assessing existing resources. The military targeting process (MTP), in line with the smart power methodology, benefits from a wide scope of lethal and non-lethal weapons to reach the intended end state. Smart power components can be used in the military targeting process in a manner similar to the use of lethal or non-lethal weapons. This paper investigates the current use of the smart power concept in the MTP and presents a new approach to the MTP from a smart power point of view.

Keywords: future security environment, hard power, military targeting process, soft power, smart power

Procedia PDF Downloads 467
18802 Enhancement of MIMO H₂S Gas Sweetening Separator Tower Using Fuzzy Logic Controller Array

Authors: Muhammad M. A. S. Mahmoud

Abstract:

The natural gas sweetening process is a controlled process that must be run at maximum efficiency and with the highest quality. In this work, due to the complexity and non-linearity of the process, the H₂S gas separation and the intelligent fuzzy controller used to enhance it are simulated in MATLAB/Simulink. A new fuzzy control design for the gas separator is discussed in this paper. The design is based on the use of linear state estimation to generate the internal knowledge base that stores input-output pairs. The obtained input/output pairs are then used to design a feedback fuzzy controller. The proposed closed-loop fuzzy control system maintains the asymptotic stability of the system while improving its time response to achieve better control of the concentration of the output gas from the tower. Simulation studies are carried out to illustrate the performance of the gas separator system.

Keywords: gas separator, gas sweetening, intelligent controller, fuzzy control

Procedia PDF Downloads 461
18801 A Tool for Assessing Performance and Structural Quality of Business Process

Authors: Mariem Kchaou, Wiem Khlif, Faiez Gargouri

Abstract:

Modeling business processes is an essential task when evaluating, improving, or documenting existing business processes. To be efficient in such tasks, a business process model (BPM) must have high structural quality and high performance. Evaluating the performance of a business process model is a necessary step to reduce time and cost, while assessing the structural quality aims to improve the understandability and the modifiability of the BPMN model. To achieve these objectives, a set of structural and performance measures has been proposed. Given the diversity of these measures, we propose a framework that integrates both structural and performance aspects in order to classify them. Our classification is based on the business process model perspectives (e.g., informational, functional, organizational, behavioral, and temporal) and on the elements (activity, event, actor, etc.) involved in computing the measures. We then implement this framework in a tool for assessing the structural quality and the performance of a business process. The tool helps designers select an appropriate subset of measures associated with the corresponding perspective, and calculate and interpret their values in order to improve the structural quality and the performance of the model.

Keywords: performance, structural quality, perspectives, tool, classification framework, measures

Procedia PDF Downloads 151
18800 The Use of Artificial Intelligence to Harmonization in the Lawmaking Process

Authors: Supriyadi, Andi Intan Purnamasari, Aminuddin Kasim, Sulbadana, Mohammad Reza

Abstract:

The developments of the Industrial Revolution 4.0 era have significantly influenced the administration of countries in all parts of the world, including Indonesia: not only the administrative and economic sectors but also the ways and methods of forming laws need to be adjusted. Until now, the process of making laws carried out by the Parliament together with the Government has used classical methods. The law-making process still relies on manual work, such as typing the harmonization of regulations, so errors are not uncommon: writing errors, mistakes in copying articles, and so on, in tasks that require a high level of accuracy yet depend on inventory and harmonization carried out manually by humans. This method often creates problems due to errors and inaccuracies on the part of the officers who harmonize laws after discussion and approval, with serious consequences for the system of law formation in Indonesia. The use of artificial intelligence in the process of forming laws appears justified and could be the answer to minimizing the disharmony of laws and regulations. This research is normative research using the legislative approach and the conceptual approach, and it focuses on the question of how artificial intelligence can be used for harmonization in the law-making process.

Keywords: artificial intelligence, harmonization, laws, intelligence

Procedia PDF Downloads 142
18799 Inadequate Requirements Engineering Process: A Key Factor for Poor Software Development in Developing Nations: A Case Study

Authors: K. Adu Michael, K. Alese Boniface

Abstract:

Developing reliable and sustainable software products is today a big challenge among up-and-coming software developers in Nigeria. The ability to develop the comprehensive problem statement needed to execute a proper requirements engineering process is missing. The need to describe the 'what' of a system in one document, written in a natural language, is a major step in the overall process of software engineering. Requirements engineering is a process used to discover, analyze, and validate system requirements, and it is needed to reduce software errors at the early stages of software development. The importance of each step in requirements engineering is explained in the context of using a detailed problem statement from the client/customer to get an overview of the existing system along with the expectations of the new system. This paper identifies inadequate requirements engineering practice as the major cause of poor software development in developing nations, using a case study of final-year computer science students of a tertiary institution in Nigeria.

Keywords: client/customer, problem statement, requirements engineering, software developers

Procedia PDF Downloads 398
18798 Modelling and Optimization of Laser Cutting Operations

Authors: Hany Mohamed Abdu, Mohamed Hassan Gadallah, El-Giushi Mokhtar, Yehia Mahmoud Ismail

Abstract:

Laser beam cutting is a nontraditional machining process. This paper optimizes the laser beam cutting parameters for stainless steel (316L), considering the effect of the input parameters power, oxygen pressure, frequency, and cutting speed. A statistical design of experiments is carried out at three levels, and process responses such as average kerf taper (Ta) and surface roughness (Ra) are measured accordingly. A quadratic mathematical model (RSM) for each response is developed as a function of the process parameters. Responses predicted by the models (as per Taguchi's L27 orthogonal array) are used to search for an optimal parametric combination that achieves the desired yield of the process. RSM models are developed for the mean responses, the S/N ratio, and the standard deviation of the responses. Optimization models are formulated as single-objective problems subject to process constraints, based on analysis of variance (ANOVA), in the MATLAB environment. The optimum solutions are compared with the results of the Taguchi methodology.
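The quadratic RSM fit described above can be sketched as an ordinary least-squares problem. The two coded factors and the synthetic, noiseless response below are illustrative stand-ins for the paper's four factors and measured responses; the design matrix simply enumerates the second-order terms of the model.

```python
import numpy as np

# Sketch of fitting a second-order (RSM) model
#   y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
# by least squares. The 27 synthetic runs below are stand-ins for a
# Taguchi L27 design; real data would come from the kerf taper or
# surface roughness measurements.
rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 27)          # coded level of, e.g., laser power
x2 = rng.uniform(-1, 1, 27)          # coded level of, e.g., cutting speed
y  = 1.0 + 0.5*x1 - 0.3*x2 + 0.2*x1**2 + 0.1*x1*x2   # noiseless response

# Design matrix with all second-order terms, then ordinary least squares
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
# coef recovers [1.0, 0.5, -0.3, 0.2, 0.0, 0.1] for this noiseless case
```

The fitted `coef` plays the role of the paper's quadratic response model; an optimizer would then search the coded factor space for the combination minimizing the predicted response subject to the process constraints.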

Keywords: optimization, laser cutting, robust design, kerf width, Taguchi method, RSM and DOE

Procedia PDF Downloads 616
18797 Probing Multiple Relaxation Process in Zr-Cu Base Alloy Using Mechanical Spectroscopy

Authors: A. P. Srivastava, D. Srivastava, D. J. Browne

Abstract:

The relaxation dynamics of a Zr44Cu40Al8Ag8 bulk metallic glass (BMG) have been probed using a dynamic mechanical analyzer (DMA). The BMG sample was cast in the form of a plate of dimensions 55 mm x 40 mm x 3 mm using the tilt casting technique. X-ray diffraction and transmission electron microscopy were used for the microstructural characterization of the as-cast BMG. For the mechanical spectroscopy study, bar samples of size 55 mm x 2 mm x 3 mm were machined from the BMG plate. The mechanical spectroscopy was performed on the DMA by the 50 mm three-point bending method in a nitrogen atmosphere. It was observed that two glass transition processes compete in the supercooled liquid region, around temperatures of 390°C and 430°C. The supercooled liquid state was completely characterized using DMA and differential scanning calorimetry (DSC). In addition to the main α-relaxation process, a β-relaxation process around 360°C, below the glass transition temperature, was also observed. The β-relaxation process could be described by an Arrhenius law with an activation energy of 160 kJ/mol. The volume of the flow unit associated with this relaxation process has been estimated. The results from the DMA study have been used to characterize the shear transformation zone in terms of activation volume and size. The high fragility parameter value of 34 and the higher activation volume indicate that this alloy could show good plasticity in the supercooled liquid region. The possible mechanisms for the relaxation processes are discussed.
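The reported Arrhenius behaviour of the β relaxation can be illustrated numerically. The sketch below uses only the activation energy quoted in the abstract (160 kJ/mol); the unit prefactor and the two temperatures compared are arbitrary choices for illustration, not values from the paper.

```python
import math

R  = 8.314    # gas constant, J/(mol K)
Ea = 160e3    # activation energy of the beta relaxation, J/mol (from the abstract)

def rate(T_kelvin, prefactor=1.0):
    """Arrhenius rate k = k0 * exp(-Ea / (R T)); prefactor is illustrative."""
    return prefactor * math.exp(-Ea / (R * T_kelvin))

# Relative speed-up of the beta process between 330 C and 360 C:
ratio = rate(360 + 273.15) / rate(330 + 273.15)
# A 30 C increase speeds the process up roughly 4.5-fold, which is the
# kind of strong temperature sensitivity an Ea of 160 kJ/mol implies.
```

Fitting `ln(k)` versus `1/T` from DMA peak frequencies at several temperatures is the standard way such an activation energy is extracted.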

Keywords: DMA, glass transition, metallic glass, thermoplastic forming

Procedia PDF Downloads 289
18796 Tool Condition Monitoring of Ceramic Inserted Tools in High Speed Machining through Image Processing

Authors: Javier A. Dominguez Caballero, Graeme A. Manson, Matthew B. Marshall

Abstract:

Cutting tools with ceramic inserts are often used in the machining of many types of superalloy, mainly due to their high strength and thermal resistance. Nevertheless, during the cutting process, the plastic flow wear generated in these inserts promotes and propagates cracks due to the high temperature and high mechanical stress, which leads to highly variable failure of the cutting tool. This article explores the relationship between the continuous wear that ceramic SiAlON inserts (solid solutions based on the Si3N4 structure) experience during a high-speed machining process and the evolution of the sparks created during that process. The sparks were analysed through pictures of the cutting process recorded using an SLR camera. Features relating to the intensity and area of the cutting sparks were extracted from the individual pictures using image processing techniques. These features were then related to the crater wear area of the ceramic inserts.
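The feature extraction step can be sketched with plain array operations: threshold a grayscale frame and measure the area and mean intensity of the bright spark pixels. The synthetic frame and the threshold value below are hypothetical; the paper works on SLR photographs of the actual cutting process.

```python
import numpy as np

# Sketch of per-frame spark feature extraction: pixels above a brightness
# threshold are counted as "spark", and their area and mean intensity are
# the two features related to crater wear in the abstract.

def spark_features(frame, thresh=200):
    """Return (area_px, mean_intensity) of pixels at or above `thresh`."""
    mask = frame >= thresh
    area = int(mask.sum())
    mean_int = float(frame[mask].mean()) if area else 0.0
    return area, mean_int

# Synthetic 100x100 grayscale frame with one bright 10 x 20 "spark" patch
frame = np.zeros((100, 100), dtype=np.uint8)
frame[40:50, 40:60] = 230

area, intensity = spark_features(frame)   # → (200, 230.0)
```

Tracking these two numbers frame by frame over the machining run gives the spark-evolution signal that is then correlated against the measured crater wear area.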

Keywords: ceramic cutting tools, high speed machining, image processing, tool condition monitoring, tool wear

Procedia PDF Downloads 290
18795 Rounded-off Measurements and Their Implication on Control Charts

Authors: Ran Etgar

Abstract:

The process of rounding off measurements of continuous variables is commonly encountered. Although it usually has minor effects, it can sometimes lead to poor outcomes in statistical process control based on the X̄-chart. The traditional control limits can lead to incorrect conclusions if applied carelessly. This study looks into the limitations of the classical control limits, particularly the impact of asymmetry. An approach to determining the distribution function of the measured parameter (Ȳ) is presented, resulting in a more precise method for establishing the upper and lower control limits. The proposed method, while slightly more complex than Shewhart's original idea, remains user-friendly and accurate and requires only the use of two straightforward tables.
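A small numerical sketch of the effect the abstract describes: computing classical Shewhart-style X̄ limits from raw versus rounded subgroup data. The data, rounding resolution, and subgroup size below are synthetic choices, not the paper's, and the limits are estimated directly from the subgroup means for brevity.

```python
import numpy as np

# Compare X-bar limits computed from raw vs rounded measurements.
# Rounding to 0.1 is coarse relative to the process sigma of 0.05, which
# is exactly the regime where classical limits start to mislead.
rng = np.random.default_rng(1)
raw = rng.normal(10.0, 0.05, size=(50, 5))   # 50 subgroups of n = 5
rounded = np.round(raw, 1)                   # measurements recorded to one decimal

def xbar_limits(samples):
    """Center line +/- 3 standard deviations of the subgroup means."""
    means = samples.mean(axis=1)
    center = means.mean()
    sigma = means.std(ddof=1)
    return center - 3 * sigma, center + 3 * sigma

lcl_raw, ucl_raw = xbar_limits(raw)
lcl_rnd, ucl_rnd = xbar_limits(rounded)
# The rounded-data limits differ from the raw ones; with coarser rounding
# the discrepancy (and the asymmetry the paper analyzes) grows.
```

The paper's contribution is to replace the naive limits on the rounded statistic with ones derived from the actual distribution of Ȳ, tabulated for practical use.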

Keywords: inaccurate measurement, SPC, statistical process control, rounded-off, control chart

Procedia PDF Downloads 24
18794 A Distributed Cryptographically Generated Address Computing Algorithm for Secure Neighbor Discovery Protocol in IPv6

Authors: M. Moslehpour, S. Khorsandi

Abstract:

Due to the shortage of IPv4 addresses, the transition to IPv6 has gained significant momentum in recent years. Like the Address Resolution Protocol (ARP) in IPv4, the Neighbor Discovery Protocol (NDP) provides functions such as address resolution in IPv6. Despite its functionality, NDP is vulnerable to several attacks. To mitigate these attacks, Internet Protocol Security (IPsec) was introduced, but it proved inefficient due to its limitations. Therefore, the SEND protocol was proposed for the automatic protection of the auto-configuration process, securing neighbor discovery and address resolution. To defend against threats to NDP's integrity and identity, SEND uses Cryptographically Generated Addresses (CGA) and asymmetric cryptography. Besides its advantages, SEND has considerable disadvantages, notably the computational cost of the CGA algorithm and the sequential nature of CGA generation. In this paper, we parallelize this process across network resources in order to improve it. In addition, we compare the CGA generation time between self-computing and distributed computing, focusing on the impact of malicious nodes on the CGA generation time in the network. According to the results, even when malicious nodes participate in the generation process, the CGA generation time is lower than when the address is computed on a single node. With a trust management system, detecting and isolating malicious nodes becomes easier.
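The brute-force nature of CGA generation, and why it parallelizes naturally, can be sketched as follows. This is a simplified, RFC 3972-flavoured modifier search over SHA-1, not the authors' implementation: the example key, the 8-bit difficulty, and the start/step partitioning scheme are all illustrative assumptions (real CGA requires 16*Sec leading zero bits over additional fields).

```python
import hashlib
from itertools import count

# Simplified sketch of the CGA "Hash2" modifier search: find a modifier
# whose SHA-1 digest over modifier || public key starts with `zero_bits`
# zero bits. Each call is independent of the others, so the modifier
# space can be split across workers, which is the distribution idea.

def find_modifier(pubkey: bytes, zero_bits: int = 8, start: int = 0, step: int = 1):
    """Scan modifiers start, start+step, start+2*step, ...; in a
    distributed run, each worker gets its own (start, step) slice."""
    for m in count(start, step):
        digest = hashlib.sha1(m.to_bytes(16, "big") + pubkey).digest()
        value = int.from_bytes(digest, "big")
        if value >> (160 - zero_bits) == 0:   # required leading zero bits
            return m

mod = find_modifier(b"example-public-key", zero_bits=8)
```

Because the expected work doubles with every extra required zero bit, splitting the search across k honest nodes divides the expected wall-clock time by roughly k, while the paper's trust management layer handles nodes that return nothing or bogus modifiers.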

Keywords: NDP, IPsec, SEND, CGA, modifier, malicious node, self-computing, distributed-computing

Procedia PDF Downloads 276
18793 Methodological Deficiencies in Knowledge Representation Conceptual Theories of Artificial Intelligence

Authors: Nasser Salah Eldin Mohammed Salih Shebka

Abstract:

Current problematic issues in AI are mainly due to those of knowledge representation conceptual theories, which in turn are reflected across the entire scope of the cognitive sciences. Knowledge representation methods and tools are derived from theoretical concepts regarding the human scientific perception of the conception, nature, and process of knowledge acquisition, knowledge engineering, and knowledge generation. Although these theoretical conceptions were themselves derived from the study of the human knowledge representation process and related theories, some essential factors were overlooked or underestimated, causing critical methodological deficiencies in the conceptual theories of human knowledge and of knowledge representation. The evaluation criteria of human cumulative knowledge, from the perspectives of the nature and theoretical aspects of knowledge representation conceptions, are greatly affected by the very materialistic nature of the cognitive sciences. This nature caused what we define as methodological deficiencies in the theoretical aspects of knowledge representation concepts in AI. These deficiencies are not confined to applications of knowledge representation theories throughout AI, but also extend to the scientific nature of the cognitive sciences. The methodological deficiencies investigated in our work are: the segregation between cognitive abilities in knowledge-driven models; the insufficiency of the two-valued logic used to represent knowledge, particularly at the machine-language level, in relation to the problematic issues of semantics and meaning theories; and the deficient consideration of the parameters of existence and time in the structure of knowledge. The latter requires a more detailed introduction of the manner in which the meanings of existence and time are to be considered in the structure of knowledge.
This does not imply that these parameters are easy to apply in knowledge representation systems; rather, outlining a deficiency caused by their absence can be considered an attempt to redefine knowledge representation conceptual approaches or, if that proves impossible, to construct a perspective on the possibility of simulating human cognition on machines. Furthermore, a redirection of the aforementioned expressions is required in order to formulate the exact meaning under discussion. This redirection of meaning shifts the role of the existence and time factors to the framework environment of the knowledge structure, and therefore to knowledge representation conceptual theories. The findings of our work indicate the necessity of differentiating between two comparative concepts when addressing the relation between the existence and time parameters and the structure of human knowledge. The topics presented throughout the paper can also be viewed as evaluation criteria to determine AI's capability to achieve its ultimate objectives. Ultimately, we argue that, although scientific progress may not have reached its peak, and human scientific evolution may not yet have reached a point where evolutionary facts about the human brain and detailed descriptions of how it represents knowledge can be discovered, unless these methodological deficiencies are properly addressed, the future of AI's qualitative progress remains questionable.

Keywords: cognitive sciences, knowledge representation, ontological reasoning, temporal logic

Procedia PDF Downloads 107
18792 Integrated Evaluation of Green Design and Green Manufacturing Processes Using a Mathematical Model

Authors: Yuan-Jye Tseng, Shin-Han Lin

Abstract:

In this research, a mathematical model for the integrated evaluation of green design and green manufacturing processes is presented. When designing a product, there can be alternative options for designing the detailed components that fulfill the same product requirement. In these design alternatives, the components of the product can be designed with different materials and detailed specifications, and these differences affect the manufacturing processes. This paper presents a new concept for integrating green design and green manufacturing processes: a green design can be determined based on the manufacturing processes of the designed product by evaluating green criteria, including energy usage and environmental impact, in addition to the traditional criterion of manufacturing cost. With this concept, a mathematical model is developed to find the green design and the associated green manufacturing processes. In the model, the cost items include material cost, manufacturing cost, and green-related cost; the green-related cost items comprise energy cost and environmental cost. The objective is to find the green design and green manufacturing process decisions that minimize the total cost. In practical applications, the decision-maker can thus select a good green design case together with its green manufacturing processes. An example product is illustrated, showing that the model is practical and useful for the integrated evaluation of green design and green manufacturing processes.
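The selection logic of the model can be sketched as a minimum-total-cost choice over design alternatives, with each alternative carrying the four cost terms the abstract names. The alternatives and all cost figures below are entirely hypothetical.

```python
# Hypothetical design alternatives, each with material, manufacturing,
# energy, and environmental cost terms (the green-related costs are the
# last two). The model picks the alternative with minimum total cost.
alternatives = {
    "design_A": {"material": 120, "manufacturing": 80, "energy": 30, "environmental": 25},
    "design_B": {"material": 100, "manufacturing": 95, "energy": 20, "environmental": 15},
    "design_C": {"material": 140, "manufacturing": 60, "energy": 45, "environmental": 40},
}

def total_cost(costs):
    """Objective: material + manufacturing + energy + environmental."""
    return sum(costs.values())

best = min(alternatives, key=lambda d: total_cost(alternatives[d]))
# best == "design_B" (total 230, vs 255 for design_A and 285 for design_C)
```

The full model in the paper couples each design alternative to its feasible manufacturing process chain, so the minimization runs over design/process pairs rather than designs alone; the toy above only shows the objective structure.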

Keywords: supply chain management, green supply chain, green design, green manufacturing, mathematical model

Procedia PDF Downloads 800
18791 A Case Study on the Development and Application of Media Literacy Education Program Based on Circular Learning

Authors: Kim Hyekyoung, Au Yunkyung

Abstract:

As media plays an increasingly important role in our lives, the age at which media usage begins is getting younger worldwide. In particular, young children are exposed to media at an early age, making early childhood media literacy education an essential task. However, most existing early childhood media literacy education programs focus solely on teaching children how to use media, and their practical implementation and application are challenging. Therefore, this study aims to develop a play-based early childhood media literacy education program utilizing topic-based media content and to explore its potential application and impact on young children's media literacy learning. Based on a review of theory and literature on media literacy education, an analysis of existing educational programs, and a survey on the current status of, and teacher perceptions about, media literacy education for preschool children, this study developed a media literacy education program for preschool children that addresses the components of media literacy (understanding media characteristics, self-regulation, self-expression, critical understanding, ethical norms, and social communication). To verify the effectiveness of the program, 20 five-year-old children from C City M Kindergarten were chosen as participants, and the program was implemented from March 28th to July 4th, 2022, once a week for a total of 7 sessions. The program was developed based on Gallenstain's (2003) iterative learning model (participation-exploration-explanation-extension-evaluation). To explore the quantitative changes before and after the program, a repeated-measures analysis of variance was conducted, and qualitative analysis was employed to examine the observed process changes.
It was found that, after the program, media literacy levels across all components (understanding media characteristics, self-regulation, self-expression, critical understanding, ethical norms, and social communication) improved significantly. The recursive-learning-based early childhood media literacy education program developed in this study can therefore be applied effectively to young children's media literacy education and can help raise their media literacy levels. In terms of observed process changes, it was confirmed that children learned about various topics, expressed their thoughts, and improved their ability to communicate with others using media content. These findings emphasize the importance of developing and implementing media literacy education programs and suggest that such programs go beyond teaching children how to use media: they can foster children's ability to use media safely and effectively in their media environment. Finally, to raise young children's media literacy levels and create a safe media environment, diverse content and methodologies are needed, and education programs should be continuously developed and evaluated.

Keywords: young children, media literacy, recursive learning, education program

Procedia PDF Downloads 63
18790 The Optimization of TICSI in the Convergence Mechanism of Urban Water Management

Authors: M. Macchiaroli, L. Dolores, V. Pellecchia

Abstract:

With the recent Resolution n. 580/2019/R/idr, the Italian Regulatory Authority for Energy, Networks, and Environment (ARERA) introduced, for water utilities characterized by persistent critical issues in the planning and organization of the service and in the implementation of the interventions necessary to improve infrastructure and management quality, a new mechanism for determining tariffs: the regulatory scheme of Convergence. The aim of this regulatory scheme is to overcome the fragmentation of the water service in order to improve the stability of local institutional structures, technical quality, and contractual quality, as well as to guarantee transparency for users of the service. The Convergence scheme presupposes the identification of the cost items to be considered in the tariff in parametric terms, distinguishing three possible cases according to the type of historical data available to the manager. This study focuses, in particular, on operators that have neither data on tariff revenues nor data on operating costs. In this case, the manager's revenue constraint (VRG) is estimated on the basis of a reference benchmark and becomes the starting point for defining the structure of the tariff classes, in compliance with the TICSI provisions (Integrated Text on tariff classes, ARERA Resolution n. 665/2017/R/idr). The proposed model builds on recent studies on optimization models for the definition of tariff classes under the constraints dictated by the TICSI within the Convergence mechanism, offering a support tool for managers and the local water regulatory authority in the decision-making process.

Keywords: decision-making process, economic evaluation of projects, optimizing tools, urban water management, water tariff

Procedia PDF Downloads 113
18789 Systemic Functional Grammar Analysis of Barack Obama's Second Term Inaugural Speech

Authors: Sadiq Aminu, Ahmed Lamido

Abstract:

This research studies Barack Obama's second inaugural speech using Halliday's Systemic Functional Grammar (SFG). SFG is a text grammar that describes how language is used, so that the meaning of a text can be better understood. The primary source of data in this research is Barack Obama's second inaugural speech, which was obtained from the internet. The analysis of the speech is based on the ideational and textual metafunctions of Systemic Functional Grammar. Specifically, the researchers analyse the process types and participants (ideational) and the theme/rheme structure (textual). It was found that the material process (the process of doing) was the most frequently used process type, and that 'We', referring to the people of America, was the most frequently used theme. Application of SFG theory therefore yields a richer understanding of Barack Obama's speech.

Keywords: ideational, metafunction, rheme, textual, theme

Procedia PDF Downloads 147