Search results for: Approaches to study
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13701


12711 Multi-agent On-line Monitor for the Safety of Critical Systems

Authors: Amer A. Dheedan

Abstract:

Operational safety of critical systems, such as nuclear power plants, industrial chemical processes and means of transportation, is a major concern for system engineers and operators. One means of assuring it is the on-line safety monitor, which delivers three safety tasks: fault detection and diagnosis, alarm annunciation and fault controlling. While current monitors deliver these tasks, both the benefits and the limitations of their approaches have been highlighted. Drawing on those benefits, this paper develops a distributed monitor based on semi-independent agents, i.e. a multi-agent system, and on monitoring knowledge derived from a safety assessment model of the monitored system. Agents are deployed hierarchically and provided with knowledge portions and collaboration protocols so that they can reason about, and integrate, the operational conditions of the components of the monitored system. The monitor aims to address the limitations arising from the large scale, complicated behaviour and distributed nature of monitored systems and to deliver the three monitoring tasks effectively.

Keywords: Alarm annunciation, fault controlling, fault detection and diagnosis

12710 Holistic Approach to Teaching Mathematics in Secondary School as a Means of Improving Students’ Comprehension of Study Material

Authors: Natalia Podkhodova, Olga Sheremeteva, Mariia Soldaeva

Abstract:

Creating favourable conditions for students' comprehension of mathematical content is one of the primary problems in teaching mathematics in secondary school. Comprehension includes the ability to build a working situational model and thus becomes an important means of solving mathematical problems. This paper describes a holistic approach to teaching mathematics designed to address the primary challenges of such teaching, specifically the challenge of students' comprehension. Essentially, this approach consists of (1) establishing links between the attributes of a notion: the sense, the meaning, and the term; (2) taking into account the components of the student's subjective experience (value-based emotional, contextual, procedural and communicative) during the educational process; (3) linking together different ways of presenting mathematical information; (4) identifying and leveraging the relationships between real, perceptual and conceptual (scientific) mathematical spaces by applying real-life situational modelling. The article describes approaches to the practical use of these foundational concepts. The primary goal of the study was to identify how the proposed methods and techniques influence understanding of the material used in teaching mathematics. The study included an experiment in which 256 secondary school students took part: 142 in the study group and 114 in the control group. All students had similar levels of achievement in mathematics and studied under the same curriculum. In the course of the experiment, comprehension of two topics, "Derivative" and "Trigonometric functions", was evaluated. Control group participants were taught using traditional methods. Students in the study group were taught using the holistic method: under the teacher's guidance, they carried out assignments designed to establish linkages between a notion's characteristics and to convert information from one mode of presentation to another, as well as assignments that required the ability to operate with all modes of presentation. Identification, accounting for and transformation of subjective experience were supported by methods that stimulate the emotional-value component of the studied mathematical content (discussions of lesson titles, assignments aimed at creating study dominants, theme-related physical exercise, etc.). The use of techniques that form inter-subject notions based on linkages between real, perceptual and conceptual mathematical spaces proved to be of special interest to the students. Results of the experiment were analysed by presenting students in each group with a final test on each of the studied topics. The test included assignments that required building real situational models. Statistical analysis was used to aggregate the test results. Pearson's chi-squared criterion was used to assess the statistical significance of the results (pass/fail on the modelling test). A significant difference was revealed (p < 0.001), which allowed the authors to conclude that students in the study group showed better comprehension of mathematical information than those in the control group. The total number of assignments completed by each student was also analysed, with average results calculated for each group. The statistical significance of the differences on this quantitative criterion (number of completed assignments) was determined using Student's t-test, which showed that students in the study group completed significantly more assignments than those in the control group (p = 0.0001). The authors thus conclude that the observed increase in the level of comprehension of the study material resulted from the implemented methods and techniques.
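To make the statistical procedure concrete, the short Python sketch below applies Pearson's chi-squared test to pass/fail counts and Student's t-test to the number of completed assignments; all numbers are synthetic placeholders and do not reproduce the study's data.

```python
# Illustrative only: pass/fail counts and per-student scores are made up;
# the study's actual data are not reproduced here.
import numpy as np
from scipy.stats import chi2_contingency, ttest_ind

# Pass/fail counts on the situational-modelling test (rows: study, control)
contingency = np.array([[118, 24],    # study group: pass, fail (hypothetical)
                        [61,  53]])   # control group: pass, fail (hypothetical)
chi2, p_chi2, dof, expected = chi2_contingency(contingency)
print(f"Pearson chi-squared = {chi2:.2f}, p = {p_chi2:.4g}")

# Number of completed assignments per student (hypothetical samples)
rng = np.random.default_rng(0)
study = rng.normal(loc=7.5, scale=1.5, size=142)
control = rng.normal(loc=6.4, scale=1.5, size=114)
t_stat, p_t = ttest_ind(study, control)
print(f"Student's t = {t_stat:.2f}, p = {p_t:.4g}")
```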

Keywords: Comprehension of mathematical content, holistic approach to teaching mathematics in secondary school, subjective experience, technology of the formation of inter-subject notions.

12709 Observations of Conformity in the Health Professions

Authors: Tanya N. Beran, Michelle A. Drefs, Ghazwan Altabbaa, Nouf Al Harbi, Noof Al Baz, Elizabeth Oddone Paolucci

Abstract:

Although interprofessional practice is a collaborative approach to problem solving among health professionals, its implementation can present challenges to team members. In particular, they may feel pressured to agree with or conform to other members who share information that is contrary to their own understanding. Obtaining evidence of this phenomenon is challenging, as team members may underreport their conformity behaviors for reasons such as social desirability. In this paper, a series of studies is reviewed in which several approaches to assessing conformity in the health care professions are tested. Simulations, questionnaires, and behavior checklists can be used to measure conformity behaviors. Insights from these studies show that a significant proportion of people conform either in the presence or in the absence of others, express a variety of verbal and nonverbal behaviors when considering whether to conform, may shift between conforming and, moments later, not conforming (and vice versa), and may not accurately report whether they conformed. A method of measuring conformity using the implicit bias test is also discussed. People at all levels of the healthcare system are encouraged to develop both formal and informal strategies to manage the conformity pressures that they face.

Keywords: Conformity, decision-making, interprofessional teams, medical simulation.

12708 A State-Of-The-Art Review on Web Services Adaptation

Authors: M. Velasco, D. While, P. Raju, J. Krasniewicz, A. Amini, L. Hernandez-Munoz

Abstract:

Web service adaptation involves the creation of adapters that solve Web service incompatibilities known as mismatches. Since the importance of Web service adaptation is increasing because of the frequent implementation and use of online Web services, this paper presents a literature review of Web services to investigate the main methods of adaptation, their theoretical underpinnings and the metrics used to measure adapters' performance. Eighteen publications were reviewed independently by two researchers. We found that adaptation techniques are needed to solve different types of problems that may arise from incompatibilities in Web service interfaces, including protocols, messages, data and semantics, which affect the interoperability of the services. Although adapters are non-invasive methods that can improve Web service interoperability and current approaches for service adaptation exist, there is not yet one solution that fits all types of mismatches. Our results also show that only a few research projects incorporate theoretical frameworks and that metrics to measure adapters' performance are very limited. We conclude that further research on software adaptation should improve current adaptation methods at the different layers of service interoperability, and that an adaptation framework incorporating a theoretical underpinning and qualitative and quantitative performance measures needs to be created.

Keywords: Web services adapters, software adaptation, web services mismatches, web services interoperability.

12707 A Modified Cross Correlation in the Frequency Domain for Fast Pattern Detection Using Neural Networks

Authors: Hazem M. El-Bakry, Qiangfu Zhao

Abstract:

Recently, neural networks have shown good results for the detection of a certain pattern in a given image. In our previous papers [1-5], a fast algorithm for pattern detection using neural networks was presented. That algorithm was designed based on cross correlation in the frequency domain between the input image and the weights of the neural networks. The input image was converted into a symmetric shape so that fast neural networks could give the same results as conventional neural networks. Another symmetry configuration was suggested in [3,4] to improve the speed-up ratio. In this paper, our previous algorithm for fast neural networks is developed further. The frequency-domain cross correlation is modified in order to compensate for the symmetry condition required of the input image. Two new ideas are introduced to modify the cross correlation algorithm. Both methods accelerate the fast neural networks because there is no need to convert the input image into a symmetric one, as was previously required. Theoretical and practical results show that both approaches provide a higher speed-up ratio than the previous algorithm.
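As a point of reference, the following sketch shows plain frequency-domain cross correlation between an image and a template (standing in for the network weights) via the FFT correlation theorem; it does not reproduce the paper's symmetry modifications, and the arrays are random placeholders.

```python
# A minimal sketch of cross-correlation in the frequency domain between an
# image and a detection template. Circular correlation via the FFT:
# corr = IFFT( FFT(image) * conj(FFT(template_padded)) )
import numpy as np

def freq_domain_cross_correlation(image: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Cross-correlate a template with an image using FFTs (circular correlation)."""
    H, W = image.shape
    th, tw = template.shape
    padded = np.zeros((H, W))
    padded[:th, :tw] = template          # zero-pad the template to the image size
    corr = np.fft.ifft2(np.fft.fft2(image) * np.conj(np.fft.fft2(padded)))
    return np.real(corr)

image = np.random.rand(128, 128)         # placeholder input image
template = np.random.rand(16, 16)        # placeholder detection template
scores = freq_domain_cross_correlation(image, template)
y, x = np.unravel_index(np.argmax(scores), scores.shape)
print(f"Best match near ({y}, {x})")
```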

Keywords: Fast Pattern Detection, Neural Networks, Modified Cross Correlation

12706 Exploiting Machine Learning Techniques for the Enhancement of Acceptance Sampling

Authors: Aikaterini Fountoulaki, Nikos Karacapilidis, Manolis Manatakis

Abstract:

This paper proposes an innovative methodology for Acceptance Sampling by Variables, a particular category of Statistical Quality Control dealing with the assurance of product quality. Our contribution lies in the exploitation of machine learning techniques to address the complexity and remedy the drawbacks of existing approaches. More specifically, the proposed methodology exploits Artificial Neural Networks (ANNs) to aid decision making about the acceptance or rejection of an inspected sample. For any type of inspection, ANNs are trained with data from the corresponding tables of a standard's sampling plan schemes. Once trained, ANNs can give closed-form solutions for any acceptance quality level and sample size, thus automating the reading of the sampling plan tables without any need to compromise with the tabulated values of the specific standard chosen each time. The proposed methodology provides quality control engineers with flexibility during the inspection of their samples, allowing specific needs to be considered, while it also reduces the time and the cost required for these inspections. Its applicability and advantages are demonstrated through two numerical examples.
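The following hedged sketch illustrates the general idea of training a small neural network to reproduce accept/reject decisions; the features, the decision rule standing in for a standard's plan tables, and all values are synthetic assumptions, not the paper's methodology.

```python
# A hedged sketch: train a small neural network to reproduce accept/reject
# decisions derived from tabulated sampling-plan data. The feature and target
# values below are synthetic placeholders, not an actual standard's tables.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
# Features: acceptance quality level (AQL), sample size, measured sample mean
X = np.column_stack([
    rng.uniform(0.1, 4.0, 2000),      # AQL (%)
    rng.integers(10, 200, 2000),      # sample size n
    rng.normal(10.0, 1.0, 2000),      # measured quality characteristic
])
# Placeholder decision rule standing in for the standard's plan tables
y = (X[:, 2] > 10.0 - 0.05 * X[:, 0]).astype(int)   # 1 = accept, 0 = reject

clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
clf.fit(X, y)
print("Decision for AQL=1.0%, n=50, mean=10.1:", clf.predict([[1.0, 50, 10.1]])[0])
```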

Keywords: Acceptance Sampling, Neural Networks, Statistical Quality Control.

12705 Automatic Translation of Ada-ECATNet Using Rewriting Logic

Authors: N. Boudiaf

Abstract:

One major difficulty that faces developers of concurrent and distributed software is the analysis of concurrency-based faults such as deadlocks. Petri nets are used extensively in the verification of correctness of concurrent programs. ECATNets are a category of algebraic Petri nets based on a sound combination of algebraic abstract data types and high-level Petri nets. ECATNets have 'sound' and 'complete' semantics because of their integration in rewriting logic and its programming language Maude. Rewriting logic is considered one of the most powerful logics for the description, verification and programming of concurrent systems. We previously proposed a method for translating Ada-95 tasking programs into the ECATNet formalism (Ada-ECATNet) and showed that ECATNets provide a more compact translation of Ada programs than other approaches based on simple Petri nets or coloured Petri nets. We also showed how the ECATNet formalism offers Ada many validation and verification tools, such as simulation, model checking, accessibility analysis and static analysis. In this paper, we describe the implementation of our translation of Ada programs into ECATNets.

Keywords: Ada tasking, Analysis, Automatic Translation, ECATNets, Maude, Rewriting Logic.

12704 Online Collaboration Learning: A Way to Enhance Students' Achievement at Kingdom of Bahrain

Authors: Jaflah H. Al-Ammary

Abstract:

The increasing recognition of the need for education to be closely aligned with team playing, project-based learning and problem-solving approaches has increased the interest in collaborative learning among university and college instructors. Using online collaborative learning (OCL) can enhance students' outcomes and achievement as well as improve their communication, critical thinking and personal skills. The current research aims at examining the effect of OCL on students' achievement in the Kingdom of Bahrain. A number of objectives were set to achieve this aim: investigating the current situation regarding collaborative learning and OCL in the Kingdom of Bahrain by identifying the advantages and effectiveness of OCL as a learning tool over traditional learning, examining the factors that affect OCL, and examining the impact of OCL on students' achievement. To achieve these objectives, a quantitative method was adopted. Two hundred and thirty-one questionnaires were distributed to students in different local and private universities in the Kingdom of Bahrain. The findings of the research show that most of the students prefer to use FTFCL in learning and that OCL is already adopted in some universities, especially the University of Bahrain. Moreover, the factors that most affect the adoption of OCL are perceived readiness, and guidance and support.

Keywords: Collaborative learning, perceived readiness, student achievement.

12703 Near-Field Robust Adaptive Beamforming Based on Worst-Case Performance Optimization

Authors: Jing-ran Lin, Qi-cong Peng, Huai-zong Shao

Abstract:

The performance of adaptive beamforming degrades substantially in the presence of steering vector mismatches. This degradation is especially severe in the near field, because the three-dimensional source location is more difficult to estimate than the two-dimensional direction of arrival in far-field cases. As a solution, a novel approach to near-field robust adaptive beamforming (RABF) is proposed in this paper. It is a natural extension of traditional far-field RABF and belongs to the class of diagonal loading approaches, with the loading level determined by worst-case performance optimization. However, unlike methods that solve for the optimal loading by iteration, it offers a simple closed-form solution after some approximations, and consequently the optimal weight vector can be expressed in closed form. Besides simplicity and low computational cost, the proposed approach reveals how different factors affect the optimal loading as well as the weight vector. Its excellent performance in the near field is confirmed via a number of numerical examples.
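A minimal numpy sketch of a diagonally loaded beamformer is given below for orientation; the loading level is a fixed placeholder rather than the closed-form worst-case solution derived in the paper, and the covariance and steering vector are simulated.

```python
# A minimal sketch of diagonal loading: the loading level here is a fixed
# placeholder, not the paper's worst-case-optimal closed-form value.
import numpy as np

def loaded_mvdr_weights(R: np.ndarray, a: np.ndarray, loading: float) -> np.ndarray:
    """MVDR weights with diagonal loading: w = (R + lam*I)^-1 a / (a^H (R + lam*I)^-1 a)."""
    Rl = R + loading * np.eye(R.shape[0])
    Rinv_a = np.linalg.solve(Rl, a)
    return Rinv_a / (a.conj().T @ Rinv_a)

M = 8                                         # number of sensors
rng = np.random.default_rng(0)
snapshots = rng.standard_normal((M, 200)) + 1j * rng.standard_normal((M, 200))
R = snapshots @ snapshots.conj().T / 200      # sample covariance matrix
a = np.exp(1j * np.pi * np.arange(M) * 0.3)   # presumed (possibly mismatched) steering vector
w = loaded_mvdr_weights(R, a, loading=10.0)
print("Array gain toward presumed steering vector:", np.abs(w.conj().T @ a))
```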

Keywords: Robust adaptive beamforming (RABF), near-field, steering vector mismatches, diagonal loading, worst-case performance optimization.

12702 Performance Analysis of Deterministic Stable Election Protocol Using Fuzzy Logic in Wireless Sensor Network

Authors: Sumanpreet Kaur, Harjit Pal Singh, Vikas Khullar

Abstract:

In a Wireless Sensor Network (WSN), the sensor motes (nodes) incorporate batteries with limited energy that deplete over time. To improve energy utilization, clustering is one of the prototypical approaches: the sensor motes are split into a number of clusters in which one mote (node) acts as the Cluster Head (CH). CH selection is one of the optimization techniques for increasing stability and network lifespan. The Deterministic Stable Election Protocol (DSEP) is an effective clustering protocol that makes use of three kinds of nodes with dissimilar residual energy for CH election. Fuzzy logic is used to improve the energy efficiency of the DSEP protocol by means of a fuzzy inference system. This paper presents DSEP using Fuzzy Logic (DSEP-FL), in which the CH is selected by taking into account four linguistic variables: energy, concentration, centrality and distance to the base station. Simulation results show that the proposed method gives more effective results in terms of network lifespan and stability compared with other clustering protocols.
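A simplified, assumption-laden sketch of scoring cluster-head candidates from the four linguistic variables is shown below; the membership functions, weights and node values are illustrative placeholders, not the fuzzy rule base of DSEP-FL.

```python
# A simplified sketch of scoring cluster-head candidates from the four
# linguistic variables. Membership functions and weights are illustrative
# placeholders, not the fuzzy rule base used in DSEP-FL.
def tri(x, a, b, c):
    """Triangular membership function."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def ch_chance(energy, concentration, centrality, dist_to_bs):
    """Combine favourable memberships into a cluster-head chance score."""
    high_energy = tri(energy, 0.3, 1.0, 1.7)               # normalised residual energy
    high_conc = min(concentration / 15.0, 1.0)             # neighbours within range
    low_centrality = max(0.0, 1.0 - centrality / 2.0)      # lower centrality cost is better
    near_bs = max(0.0, 1.0 - dist_to_bs / 100.0)           # closer to base station is better
    weights = (0.4, 0.2, 0.2, 0.2)
    return sum(w * m for w, m in zip(weights, (high_energy, high_conc, low_centrality, near_bs)))

# Hypothetical candidate nodes: (energy, concentration, centrality, distance to BS)
nodes = {"n1": (0.9, 6, 0.8, 40.0), "n2": (0.5, 12, 1.2, 70.0), "n3": (0.7, 4, 0.5, 25.0)}
best = max(nodes, key=lambda n: ch_chance(*nodes[n]))
print("Selected cluster head:", best)
```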

Keywords: Deterministic stable election protocol, energy model, fuzzy logic, wireless sensor network.

12701 Control Configuration Selection and Controller Design for Multivariable Processes Using Normalized Gain

Authors: R. Hanuma Naik, D. V. Ashok Kumar, K. S. R. Anjaneyulu

Abstract:

Many practical industrial control processes are multivariable. Due to the relations among the variables (interaction) and the delays in the loops, it is very intricate to design a controller directly for these processes. So first, the interaction of the variables is analyzed using the Relative Normalized Gain Array (RNGA), which considers the time constant, static gain and delay time of the processes. Based on the effect of the RNGA, the relative gain array (RGA) and NI, the pairing (control configuration) of variables to be controlled by decentralized control is selected. The equivalent transfer function (ETF) of the process model is estimated as a first-order process with delay using the corresponding elements of the relative gain array and the relative average residence time array (RARTA) of the process. Secondly, a decentralized Proportional-Integral (PI) controller is designed for each ETF simply using frequency response specifications. Finally, the performance and robustness of the algorithm are compared with existing related approaches to validate the effectiveness of the proposed algorithm.
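For illustration, the numpy sketch below computes the RGA and an RNGA-style array from a 2x2 process description (static gains, time constants and delays); the parameter values are placeholders loosely modelled on a textbook example, not the processes studied in the paper.

```python
# A short numpy sketch of the RGA/RNGA pairing analysis. The 2x2 process
# parameters (static gains k, time constants tau, delays theta) are illustrative.
import numpy as np

def relative_gain_array(K: np.ndarray) -> np.ndarray:
    """RGA = K (elementwise *) (K^-1)^T."""
    return K * np.linalg.inv(K).T

k = np.array([[12.8, -18.9],
              [ 6.6, -19.4]])          # static gains (Wood-Berry-like placeholder values)
tau = np.array([[16.7, 21.0],
                [10.9, 14.4]])         # time constants
theta = np.array([[1.0, 3.0],
                  [7.0, 3.0]])         # delay times

RGA = relative_gain_array(k)
KN = k / (tau + theta)                  # normalized gains: gain / average residence time
RNGA = relative_gain_array(KN)
print("RGA:\n", np.round(RGA, 3))
print("RNGA:\n", np.round(RNGA, 3))
```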

Keywords: Decentralized control, interaction, Multivariable processes, relative normalized gain array, relative average residence time array, steady state gain.

12700 Modelling a Hospital as a Queueing Network: Analysis for Improving Performance

Authors: Emad Alenany, M. Adel El-Baz

Abstract:

In this paper, the flow of different classes of patients into a hospital is modelled and analyzed using the Queueing Network Analyzer (QNA) algorithm and discrete event simulation. The input data for QNA are the rate and variability parameters of the arrival and service times, in addition to the number of servers in each facility. The modelled patient flows closely match the real flows of a hospital in Egypt. Based on the analysis of the waiting times, two approaches are suggested for improving performance: separating patients into service groups, and adopting different service policies for sequencing patients through hospital units. Serving a specific group of patients with a higher performance target separately from the rest of the patients, who require a lower performance target, needs the same capacity while improving performance for the selected group. It is also shown that, among the tested policies, adopting the shortest processing time and shortest remaining processing time service policies would result in 11.47% and 13.75% reductions in average waiting time, respectively, relative to the first-come-first-served policy.
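The toy Python sketch below conveys why SPT sequencing shortens average waits compared with FCFS in a single-server setting; the hospital model in the paper is a multi-class queueing network and is not reproduced here.

```python
# A toy single-server comparison of FCFS and SPT sequencing; the paper's
# multi-class hospital queueing network is not reproduced here.
import random

def average_wait(service_times, policy="FCFS"):
    """All patients present at time zero; one server; non-preemptive service."""
    queue = list(enumerate(service_times))
    if policy == "SPT":
        queue.sort(key=lambda p: p[1])      # shortest processing time first
    clock, waits = 0.0, []
    for _, s in queue:
        waits.append(clock)                 # waiting time before service starts
        clock += s
    return sum(waits) / len(waits)

random.seed(0)
service_times = [random.expovariate(1 / 20.0) for _ in range(200)]  # minutes (synthetic)
print("FCFS average wait:", round(average_wait(service_times, "FCFS"), 1))
print("SPT  average wait:", round(average_wait(service_times, "SPT"), 1))
```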

Keywords: Queueing network, discrete-event simulation, health applications, SPT.

12699 Comparative Studies on Interactions of Synthetic and Natural Compounds with Hen Egg-White Lysozyme

Authors: Seifollah Bahramikia

Abstract:

Amyloid aggregation of polypeptides is related to a growing number of pathologic states known as amyloid disorders. In recent years, blocking or reversing amyloid aggregation via the use of small compounds have been considered two useful approaches for hampering the development of these diseases. In this research, we compared the ability of several manganese-salen derivatives, as synthetic compounds, and apigenin, as a natural flavonoid, to inhibit hen egg-white lysozyme (HEWL) aggregation as an in vitro model system. Different spectroscopic analyses, such as Thioflavin T (ThT) and anilinonaphthalene-8-sulfonic acid (ANS) fluorescence and Congo red (CR) absorbance, along with transmission electron microscopy, were used to monitor the kinetics and inhibition of HEWL aggregation. Our results demonstrated that both types of compounds were capable of preventing the formation of lysozyme amyloid aggregates in vitro. In addition, our data indicated that the synthetic compounds had higher activity in inhibiting β-sheet structures than the natural compound. Given the higher antioxidant activities of the salen derivatives, it can be concluded that, in addition to the aromatic rings of each compound, the potent antioxidant properties of the salen derivatives contribute to lower lysozyme fibril accumulation.

Keywords: Aggregation, anti-amyloidogenic, apigenin, hen egg white lysozyme, salen derivatives.

12698 Cadmium Filter Cake of a Hydrometallurgical Zinc Smelter as a New Source for the Biological Synthesis of CdS Quantum Dots

Authors: Mehran Bakhshi, Mohammad Raouf Hosseini, Mohammadhosein Rahimi

Abstract:

The cadmium sulfide nanoparticles were synthesized from the nickel-cadmium cake of a hydrometallurgical zinc producing plant and sodium sulfide as the Cd2+ and S2- sources, respectively. The synthesis process was performed using the secretions of Bacillus licheniformis as a bio-surfactant. Initially, in order to obtain a cadmium-rich solution, the two following steps were carried out: 1) alkaline leaching to remove zinc oxide from the cake, and 2) acidic leaching to dissolve cadmium from the remaining solid residue. Afterward, the obtained CdSO4 solution was used for the nanoparticle biosynthesis. The nanoparticles were characterized by energy dispersive spectroscopy (EDS) and X-ray diffraction (XRD) to confirm the formation of CdS crystals with a cubic structure. Also, transmission electron microscopy (TEM) was applied to determine the particle sizes, which were in the 2-10 nm range. Moreover, the presence of protein-containing bio-surfactants was confirmed by infrared spectroscopy (FTIR). In addition, the absorbance below 400 nm confirms the quantum size of the particles. Finally, it was shown that valuable CdS quantum dots can be obtained from industrial waste products via environmentally friendly biological approaches.

Keywords: Biosynthesis, cadmium cake, cadmium sulfide, nanoparticle, zinc smelter.

12697 Shaking Force Balancing of Mechanisms: An Overview

Authors: Vigen Arakelian

Abstract:

The balancing of mechanisms is a well-known problem in the field of mechanical engineering because the variable dynamic loads cause vibration, as well as noise, wear and fatigue of the machines. A mechanical system with unbalanced shaking force and shaking moment transmits substantial vibration to the frame. Therefore, the objective of balancing is to cancel or reduce the variable dynamic reactions transmitted to the frame. The resolution of this problem consists in balancing the shaking force and shaking moment. This can be done fully or partially, either by internal mass redistribution via adding counterweights or by modification of the mechanism's architecture via adding auxiliary structures. Balancing problems are of continuing interest to researchers: several laboratories around the world are very active in this area and new results are published regularly. Despite its long history, mechanism balancing theory continues to be developed, and new approaches and solutions are constantly being reported. Various surveys have been published that disclose the particularities of balancing methods. The author believes that this is an appropriate moment to present the state of the art of shaking force balancing studies, complemented by new research results. This paper presents an overview of methods devoted to the shaking force balancing of mechanisms, as well as the historical aspects of the origins and evolution of the balancing theory of mechanisms.

Keywords: Inertia forces, shaking forces, balancing, dynamics.

12696 Business Skills Laboratory in Action: Combining a Practice Enterprise Model and an ERP-Simulation to a Comprehensive Business Learning Environment

Authors: Karoliina Nisula, Samuli Pekkola

Abstract:

Business education has been criticized for being too theoretical and distant from business life. Different types of experiential learning environments ranging from manual role-play to computer simulations and enterprise resource planning (ERP) systems have been used to introduce the realistic and practical experience into business learning. Each of these learning environments approaches business learning from a different perspective. The implementations tend to be individual exercises supplementing the traditional courses. We suggest combining them into a business skills laboratory resembling an actual workplace. In this paper, we present a concrete implementation of an ERP-supported business learning environment that is used throughout the first year undergraduate business curriculum. We validate the implementation by evaluating the learning outcomes through the different domains of Bloom’s taxonomy. We use the role-play oriented practice enterprise model as a comparison group. Our findings indicate that using the ERP simulation improves the poor and average students’ lower-level cognitive learning. On the affective domain, the ERP-simulation appears to enhance motivation to learn as well as perceived acquisition of practical hands-on skills.

Keywords: Business simulations, experiential learning, ERP systems, learning environments.

12695 Modal Approach for Decoupling Damage Cost Dependencies in Building Stories

Authors: Haj Najafi Leila, Tehranizadeh Mohsen

Abstract:

Dependencies between the diverse factors involved in probabilistic seismic loss evaluation are recognized as an imperative issue in acquiring accurate loss estimates. Dependencies among component damage costs can be taken into account by considering the two distinct limiting states of independent or perfectly dependent component damage states; however, to the best of our knowledge, no procedure is available to account for loss dependencies at the story level. This paper presents a method called the "modal cost superposition method" for decoupling story damage costs under earthquake ground motions. It is based on closed-form differential equations between damage cost and engineering demand parameters, which are solved as a coupled system covering all stories' cost equations by means of the introduced "substituted matrices of mass and stiffness". Costs are treated as probabilistic variables with defined statistical factors (median and standard deviation) and a presumed probability distribution. To support the proposed procedure and to demonstrate the straightforwardness of its application, a benchmark study has been conducted. Acceptable compatibility is demonstrated between the damage costs estimated by the newly proposed modal approach and those from the frequently used stochastic approach for the entire building; at the story level, however, the insufficiency of a single modification factor for incorporating occurrence-probability dependencies between stories is revealed, owing to the discrepant amounts of dependency between the damage costs of different stories. A greater dependency contribution to the occurrence probability of loss can also be concluded from the better compatibility of loss results in the higher stories than in the lower ones, whereas reducing the number of included cost modes still provides an acceptable level of accuracy and avoids the time-consuming calculations that arise when many cost modes are included in high-mode situations.

Keywords: Dependency, story-cost, cost modes, engineering demand parameter.

12694 Decision Support System for Hospital Selection in Emergency Medical Services: A Discrete Event Simulation Approach

Authors: D. Tedesco, G. Feletti, P. Trucco

Abstract:

The present study aims to develop a Decision Support System (DSS) to support operational decisions in Emergency Medical Service (EMS) systems regarding the assignment of medical emergency requests to Emergency Departments (ED). This problem is called "hospital selection" and concerns the definition of policies for selecting the ED to which patients who require further treatment are transported by ambulance. The research methodology consists of a first phase reviewing the technical-scientific literature on DSSs supporting EMS management and, in particular, the hospital selection decision. The literature analysis showed that current studies mainly focus on the EMS phases related to the ambulance service and consider a process that ends when the ambulance becomes available after completing a mission. Therefore, all ED-related issues are excluded and treated as part of a separate process. Indeed, the most studied hospital selection policy turned out to be proximity, which minimizes travel time and frees up the ambulance in the shortest possible time. The purpose of the present study is to develop an optimization model for assigning medical emergency requests to EDs that also considers the expected time performance in the subsequent phases of the process, such as the case mix, the expected service throughput times, and the operational capacity of the different EDs. To this end, a Discrete Event Simulation (DES) model was created to compare different hospital selection policies. The model was implemented with the AnyLogic software and validated on a realistic case. The hospital selection policy that returned the best results was the minimization of the Time To Provider (TTP), defined as the time from the beginning of the ambulance journey to the start of the clinical evaluation by the ED doctor. Finally, two approaches were compared: a static approach, based on a retrospective estimation of the TTP, and a dynamic approach, based on a predictive estimation of the TTP determined with a constantly updated Winters forecasting model. Findings reveal that minimizing the TTP is the best hospital selection policy: it significantly reduces service throughput times in the ED with a negligible increase in travel time. Furthermore, it provides an immediate view of the saturation state of the ED and accounts for the case mix present in the ED structures (i.e., the different triage codes), as different severity codes correspond to different service throughput times. In addition, the predictive approach is more reliable for TTP estimation than the retrospective approach. These considerations can support decision-makers in introducing different hospital selection policies to enhance EMS performance.
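As an illustration of the dynamic TTP estimation idea, the sketch below fits a Winters (Holt-Winters) model to a synthetic hourly TTP series with statsmodels; the data, seasonality choice and settings are assumptions, not those of the study.

```python
# A hedged sketch of the dynamic TTP estimation idea: a Winters (Holt-Winters)
# model fitted to a synthetic hourly TTP series; the actual EMS data and model
# settings are not reproduced here.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(0)
hours = pd.date_range("2023-01-01", periods=24 * 28, freq="H")
daily_cycle = 10 * np.sin(2 * np.pi * np.asarray(hours.hour) / 24)   # busier at peak hours
ttp_minutes = 45 + daily_cycle + rng.normal(0, 3, len(hours))        # synthetic TTP series
series = pd.Series(ttp_minutes, index=hours)

model = ExponentialSmoothing(series, trend="add", seasonal="add", seasonal_periods=24)
fit = model.fit()
next_hours = fit.forecast(6)      # predicted TTP for the next six hours
print(next_hours.round(1))
```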

Keywords: Emergency medical services, hospital selection, discrete event simulation, forecast model.

12693 Aspect Oriented Software Architecture

Authors: Pradip Peter Dey, Ronald F. Gonzales, Gordon W. Romney, Mohammad Amin, Bhaskar Raj Sinha

Abstract:

Natural language processing systems pose a unique challenge for software architectural design as system complexity has increased continually and systems cannot be easily constructed from loosely coupled modules. Lexical, syntactic, semantic, and pragmatic aspects of linguistic information are tightly coupled in a manner that requires separation of concerns in a special way in design, implementation and maintenance. An aspect oriented software architecture is proposed in this paper after critically reviewing relevant architectural issues. For the purpose of this paper, the syntactic aspect is characterized by an augmented context-free grammar. The semantic aspect is composed of multiple perspectives including denotational, operational, axiomatic and case frame approaches. Case frame semantics matured in India from deep thematic analysis. It is argued that lexical, syntactic, semantic and pragmatic aspects work together in a mutually dependent way and their synergy is best represented in the aspect oriented approach. The software architecture is presented with an augmented Unified Modeling Language.

Keywords: Language engineering, parsing, software design, user experience.

12692 Development of Entrepreneurship in Industry on the Basis of Regulation of Transnational Production Chains in the Russian Arctic

Authors: E. N. Vetrova, L.V. Lapochkina, N. V. Nikulina

Abstract:

In the national economy, entrepreneurship plays the role of a buffer between the economy and policy, for it contributes to improving budget effectiveness and decreasing the economy's dependence on the state. Entrepreneurship in industry makes it possible to increase the added value formed in production chains and to decrease dependence on imports. Under the current circumstances, when sanctions are being imposed, this is especially relevant for Russia and for the realization of projects in the Russian Arctic. However, the development of entrepreneurship in industry requires an enlightened state policy. The purpose of the research is the elaboration of recommendations for improving the economic effectiveness of the Arctic projects on the basis of conceptual proposals for the development of entrepreneurship in industry. The paper presents studies of the role of the extractive industry in the Russian economy and demonstrates its raw-material character. The analysis of production chains in industry, based on the concept of global value-added chains, demonstrated the low added value formed by Russian companies. The study of changes in the structure of the economy, based on systemic, statistical and comparative analyses, revealed no positive changes over the period under consideration. This is a manifestation of the ineffectiveness of Russian industrial policy in general and within the Arctic region in particular. The authors identified the problems in the formation and implementation of the state industrial policy in the Arctic region and in the development of national entrepreneurship, and analyzed the shortcomings of the current state policy in the sphere of Russian industry. On the basis of the conducted studies, the authors formulated conceptual approaches to changing the state policy in the Arctic. The basic idea of the authors is to substantiate the focus of state regulation on the development of entrepreneurship in industry in the process of exploring the Russian Arctic. At the same time, another problem is addressed: the development of the manufacturing industry in the southern regions of the northwestern part of Russia. The criterion of effectiveness in this case is economic effectiveness.

Keywords: Entrepreneurship in industry, global value-added chains, government regulation, industrial policies, production chains in the Arctic region, economic effectiveness.

12690 A Proposal of a Personnel Assessment Method Including a Two-Way Assessment for Evaluating Evaluators and Employees

Authors: Shunsuke Saito, Kazuho Yoshimoto, Shunichi Ohmori, Sirawadee Arunyanart

Abstract:

In this paper, we suggest an assessment mechanism that both raters (evaluators) and ratees (employees) can find convincing. Many problems exist in personnel assessment; we focus on three of them: (1) raters do not sufficiently recognize the assessment points; (2) ratees are not convinced by the assessment mechanism; (3) the empathy between raters and ratees. To solve these problems, we suggest (1) a setting for the "understanding of the assessment points", (2) a setting for "relative assessment ability", and (3) a two-way assessment mechanism. As a prerequisite, it is assumed that there are multiple raters, because multi-faceted assessment has been growing in importance. In this model, the weight of each assessment point is determined from the raters' and ratees' degree of understanding and assessment ability. We use the ANP (Analytic Network Process), a theory that extends the decision-making technique AHP (Analytic Hierarchy Process). ANP can address problems formulated as a network, so a two-way assessment is possible. Applying this technique to personnel assessment, the weight of each rater at each assessment point can be reasonably determined. We suggest an absolute assessment for the two-way assessment by ANP. We have verified that acceptance of the proposed approach is higher than that of the conventional mechanism, and we also received comments from a human resources consultant about applying the method in practice.

Keywords: Personnel assessment, ANP (analytic network process), two-way.

12690 Evolutionary Algorithms for Learning Primitive Fuzzy Behaviors and Behavior Coordination in Multi-Objective Optimization Problems

Authors: Li Shoutao, Gordon Lee

Abstract:

Evolutionary robotics is concerned with the design of intelligent systems with life-like properties by means of simulated evolution. Approaches in evolutionary robotics can be categorized according to the control structures that represent the behavior and the parameters of the controller that undergo adaptation. The basic idea is to automatically synthesize behaviors that enable the robot to perform useful tasks in complex environments. The evolutionary algorithm searches through the space of parameterized controllers that map sensory perceptions to control actions, thus realizing a specific robotic behavior. Further, the evolutionary algorithm maintains and improves a population of candidate behaviors by means of selection, recombination and mutation. A fitness function evaluates the performance of the resulting behavior according to the robot's task or mission. In this paper, the focus is on the use of genetic algorithms to solve a multi-objective optimization problem representing robot behaviors; in particular, the A-Compander Law is employed in selecting the weight of each objective during the optimization process. Results using an adaptive fitness function show that this approach can efficiently react to complex tasks under variable environments.
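A bare-bones sketch of weighted-sum fitness aggregation inside a simple genetic algorithm is given below; the A-Compander law for selecting the weights is not reproduced, and fixed weights, objectives and GA settings stand in for it.

```python
# A bare-bones sketch of weighted-sum fitness aggregation in a genetic
# algorithm; the A-Compander weight-selection law from the paper is not
# reproduced, and fixed weights stand in for it.
import numpy as np

rng = np.random.default_rng(0)

def objectives(x):
    """Two competing objectives over behaviour parameters x (both to be minimised)."""
    return np.array([np.sum(x ** 2), np.sum((x - 2.0) ** 2)])

def fitness(x, weights=(0.5, 0.5)):
    return -float(np.dot(weights, objectives(x)))    # higher is better

pop = rng.uniform(-5, 5, size=(40, 4))               # 40 candidate controllers
for gen in range(100):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-20:]]           # selection: keep the best half
    children = parents + rng.normal(0, 0.2, parents.shape)   # mutation
    pop = np.vstack([parents, children])
print("Best parameters:", pop[np.argmax([fitness(ind) for ind in pop])].round(2))
```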

Keywords: adaptive fuzzy neural inference, evolutionary tuning

12689 Enhancing the Effectiveness of Air Defense Systems through Simulation Analysis

Authors: F. Felipe

Abstract:

Air Defense Systems contain high-value assets that are expected to fulfill their mission for several years - in many cases, even decades - while operating in a fast-changing, technology-driven environment. Thus, it is paramount that decision-makers can assess how effective an Air Defense System is in the face of new developing threats, as well as to identify the bottlenecks that could jeopardize the security of the airspace of a country. Given the broad extent of activities and the great variety of assets necessary to achieve the strategic objectives, a systems approach was taken in order to delineate the core requirements and the physical architecture of an Air Defense System. Then, value-focused thinking helped in the definition of the measures of effectiveness. Furthermore, analytical methods were applied to create a formal structure that preliminarily assesses such measures. To validate the proposed methodology, a powerful simulation was also used to determine the measures of effectiveness, now in more complex environments that incorporate both uncertainty and multiple interactions of the entities. The results regarding the validity of this methodology suggest that the approach can support decisions aimed at enhancing the capabilities of Air Defense Systems. In conclusion, this paper sheds some light on how consolidated approaches of Systems Engineering and Operations Research can be used as valid techniques for solving problems regarding a complex and yet vital matter.

Keywords: Air defense, effectiveness, system, simulation, decision-support.

12688 The Prostitute’s Body in Diasporic Space: Sexualized China and Chineseness in Yu Dafu’s Sinking and Yan Geling’s The Lost Daughter of Happiness

Authors: Haizhi Wu

Abstract:

Sexualization brings together the interdependent experiences of prostitution and diaspora, establishing a masculine structure in which a female body mediates the hegemony and sexuality of men from different races. Between eroticism and homesickness, writers of the Chinese diaspora develop sensual approaches to reflect on the diasporic experience and sexual frustration. Notably, Yu Dafu in Sinking and Yan Geling in The Lost Daughter of Happiness both take an interest in sexual encounters between an immature teenage client and an erotically powerful prostitute in Japan or America, both countries considered colonizers in Chinese history. While both utilize the metaphor of body-space interplay to hint at transnational interactions beyond the text, the two writers present distinct understandings of their bond with the history and memory of semi-colonial, semi-feudal China. Examining prostitutes' bodies in multi-layered diasporic spaces, the central analysis of this paper works on the sexual, colonial, and historical representations of this bodily symbol and on prostitution's engagement in negotiating diaspora and "Chineseness".

Keywords: Chineseness, Diasporic spaces, Prostitutes' bodies, Sexualization.

12687 Determining the Width and Depths of Cut in Milling on the Basis of a Multi-Dexel Model

Authors: Jens Friedrich, Matthias A. Gebele, Armin Lechler, Alexander Verl

Abstract:

Chatter vibrations and process instabilities are the most important factors limiting the productivity of the milling process. Chatter can lead to damage of the tool, the part or the machine tool. Therefore, the estimation and prediction of process stability are very important. The process stability depends on the spindle speed, the depth of cut and the width of cut. In milling, the process conditions are defined in the NC program. While the spindle speed is directly coded in the NC program, the depth and width of cut are unknown. This paper presents a new simulation-based approach for the prediction of the depth and width of cut of a milling process. The prediction is based on a material removal simulation with an analytically represented tool shape and a multi-dexel approach for the workpiece. The new calculation method allows the direct estimation of the depth and width of cut, which are the influencing parameters of process stability, instead of the removed volume as existing approaches do. This knowledge can be used to predict the stability of new, unknown parts. Moreover, with an additional vibration sensor, the stability lobe diagram of a milling process can be estimated and improved based on the estimated depth and width of cut.

Keywords: Dexel, process stability, material removal, milling.

12686 Visualizing Imaging Pathways after Anatomy-Specific Follow-Up Imaging Recommendations

Authors: Thusitha Mabotuwana, Christopher S. Hall

Abstract:

Radiologists routinely make follow-up imaging recommendations, usually based on established clinical practice guidelines, such as the Fleischner Society guidelines for managing lung nodules. In order to ensure optimal care, it is important to make guideline-compliant recommendations and also for patients to follow up on these imaging recommendations in a timely manner. However, determining such compliance rates after a specific finding has been observed usually requires many time-consuming manual steps. To address some of these limitations of current approaches, this paper discusses a methodology to automatically detect finding-specific follow-up recommendations in radiology reports and to create a visualization of the relevant subsequent exams showing the modality transitions. Nearly 5% of patients who had a lung-related follow-up recommendation continued to have at least eight subsequent outpatient CT exams during the seven-year period following the recommendation. Radiologists and section chiefs can use the proposed tool to better understand how a specific patient population is being managed, identify possible deviations from established guideline recommendations, and obtain a patient-specific graphical representation of the imaging pathways for an abstract view of the overall treatment path thus far.
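A simplified sketch of mining follow-up recommendations from report text with regular expressions is shown below; the pattern, report text and captured fields are illustrative assumptions, and the paper's actual detection pipeline is not reproduced.

```python
# A simplified sketch of extracting follow-up recommendations from report text
# with a regular expression; the paper's actual NLP pipeline is more elaborate,
# and this pattern only matches recommendations that state an interval.
import re

FOLLOWUP_PATTERN = re.compile(
    r"(recommend|suggest|advise)[^.]{0,120}?"
    r"(follow[- ]?up|repeat)\s+(CT|MRI|ultrasound|chest x-ray|imaging)"
    r"[^.]{0,80}?in\s+(\d+)\s*(days?|weeks?|months?|years?)",
    re.IGNORECASE,
)

report = ("IMPRESSION: 6 mm right upper lobe nodule. "
          "Recommend follow-up CT in 12 months per Fleischner Society guidelines.")

m = FOLLOWUP_PATTERN.search(report)
if m:
    print("Modality:", m.group(3), "| interval:", m.group(4), m.group(5))
```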

Keywords: Follow-up recommendations, care pathways, imaging pathway visualization, follow-up tracking.

12685 Searching for Forensic Evidence in a Compromised Virtual Web Server against SQL Injection Attacks and PHP Web Shell

Authors: Gigih Supriyatno

Abstract:

SQL injection is one of the most common types of attacks and has a very critical impact on web servers. In the worst case, an attacker can perform post-exploitation after a successful SQL injection attack. In web server forensics, the analysis is closely related to log file analysis, but large file sizes and different log types can make it difficult for investigators to look for traces of attackers on the server. The purpose of this paper is to help investigators take appropriate steps when a web server is attacked. We use attack scenarios based on SQL injection, including PHP backdoor (web shell) injection as post-exploitation. We perform post-mortem analysis of web server logs based on the Hypertext Transfer Protocol (HTTP) POST and HTTP GET methods, which are characteristic of SQL injection attacks. In addition, we propose a structured analysis method linking the web server application log file, the database application log, and other additional logs that exist on the web server. This method helps the investigator analyze the log files in a structured way so as to produce evidence of the attack within an acceptable time. Other attack techniques may also be detectable with this method. On the other hand, it can help web administrators prepare their systems for forensic readiness.
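The hedged sketch below shows one way to scan Apache-style access logs for common SQL injection signatures in URL-encoded requests; the signature list and log path are illustrative, and POST payloads would additionally require application-level logs, as the paper notes.

```python
# A hedged sketch of scanning Apache-style access logs for common SQL injection
# signatures in requests. The patterns below are illustrative, not exhaustive,
# and POST payloads require the application-level log, as noted in the paper.
import re
from urllib.parse import unquote

SQLI_SIGNATURES = re.compile(
    r"(union\s+select|or\s+1=1|information_schema|"
    r"sleep\(|benchmark\(|load_file\(|into\s+outfile)",
    re.IGNORECASE,
)

def suspicious_lines(log_path):
    """Yield (line number, raw line) for log entries matching SQLi signatures."""
    with open(log_path, encoding="utf-8", errors="replace") as f:
        for lineno, line in enumerate(f, 1):
            decoded = unquote(line).replace("+", " ")   # reveal URL-encoded payloads
            if SQLI_SIGNATURES.search(decoded):
                yield lineno, line.strip()

# Example usage (default Apache path shown for illustration):
# for hit in suspicious_lines("/var/log/apache2/access.log"):
#     print(hit)
```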

Keywords: Web forensic, SQL injection, web shell, investigation.

12684 Indigenous Engagement: Towards a Culturally Sensitive Approach for Inclusive Economic Development

Authors: K. N. Penna, E. J. Hoffman, T. R. Carter

Abstract:

This paper suggests that cultural landscape management plans in an Indigenous context are more effective if designed by taking into consideration context-related social and cultural aspects, adopting people-centred and cultural-based approaches for instance. In relation to working in Indigenous and mining contexts, we draw upon and contribute to international policies on human rights that promote the development of management plans that are co-designed through genuine engagement processes. We suggest that the production of management plans that are built upon culturally relevant frameworks leads to more inclusive economic development, a greater sense of trust, and shared managerial responsibilities. In this paper, three issues related to Indigenous engagement and cultural landscape management plans will be addressed: (1) the need for effective communication channels between proponents and Traditional Owners (Australian original Aboriginal peoples who inhabited specific regions), (2) the use of a culturally sensitive approach to engage local representatives in the decision-making processes, and (3) how design of new management plans can help in establishing shared management.

Keywords: Culture-Centred Approach, Holons’ Hierarchy, Inclusive Economic Development, Indigenous Engagement.

12683 Reliability Evaluation of Composite Electric Power System Based On Latin Hypercube Sampling

Authors: R. Ashok Bakkiyaraj, N. Kumarappan

Abstract:

This paper investigates the suitability of Latin Hypercube Sampling (LHS) for composite electric power system reliability analysis. Each sample generated by LHS is mapped to an equivalent system state and used for evaluating the annualized system and load-point indices. A DC load-flow based state evaluation model is solved for each sampled contingency state. The indices evaluated are loss of load probability, loss of load expectation, expected demand not served and expected energy not supplied. The application of LHS is illustrated through case studies carried out using the RBTS and IEEE-RTS test systems. The results obtained are compared with non-sequential Monte Carlo simulation and state enumeration analytical approaches. An error analysis is also carried out to check the ability of the LHS method to capture the distributions of the reliability indices. It is found that the LHS approach estimates indices closer to the actual values and gives tighter bounds on the indices than non-sequential Monte Carlo simulation.
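For orientation, the sketch below draws component-state samples with SciPy's Latin Hypercube sampler; the outage probabilities and the stand-in state evaluation rule are placeholders for the DC load-flow model used in the paper.

```python
# A minimal sketch of drawing component-state samples with Latin Hypercube
# sampling; component outage probabilities and the state-evaluation step
# (DC load flow) are placeholders.
import numpy as np
from scipy.stats import qmc

outage_prob = np.array([0.01, 0.02, 0.015, 0.03, 0.01])   # per-component FOR (illustrative)
n_samples = 10_000

sampler = qmc.LatinHypercube(d=len(outage_prob), seed=0)
u = sampler.random(n_samples)                 # stratified uniforms in [0, 1)^d
component_down = u < outage_prob              # map each sample to a system state

# Placeholder state evaluation: call the DC load-flow model here; as a stand-in,
# declare loss of load whenever two or more components are out simultaneously.
loss_of_load = component_down.sum(axis=1) >= 2
print("Estimated LOLP:", loss_of_load.mean())
```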

Keywords: Composite power system, Latin Hypercube sampling, Monte Carlo simulation, Reliability evaluation, Variance analysis.

12682 Validating Condition-Based Maintenance Algorithms Through Simulation

Authors: Marcel Chevalier, Léo Dupont, Sylvain Marié, Frédérique Roffet, Elena Stolyarova, William Templier, Costin Vasile

Abstract:

Industrial end users are currently facing an increasing need to reduce the risk of unexpected failures and optimize their maintenance. This calls for both short-term analysis and long-term ageing anticipation. At Schneider Electric, we tackle those two issues using both machine learning and first-principles models. Machine learning models are incrementally trained from normal data to predict expected values and detect statistically significant short-term deviations. Ageing models are constructed by breaking down physical systems into sub-assemblies, then determining the relevant degradation modes and associating each one with the right kinetic law. Validating such anomaly detection and maintenance models is challenging, both because actual incident and ageing data are rare and distorted by human interventions, and because incremental learning depends on human feedback. To overcome these difficulties, we propose to simulate physics, systems and humans – including asset maintenance operations – in order to validate the overall approaches in accelerated time and possibly choose between algorithmic alternatives.
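As a small illustration of the incrementally trained normal-behaviour model, the sketch below maintains a running mean and variance (Welford's algorithm) and flags large deviations; the threshold and data are illustrative choices, not Schneider Electric's models.

```python
# A small sketch of the "incrementally trained from normal data" idea: a running
# mean/variance (Welford's algorithm) flags statistically significant deviations.
# The 3-sigma threshold and the data are illustrative choices only.
import math

class IncrementalDetector:
    def __init__(self, z_threshold=3.0):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
        self.z_threshold = z_threshold

    def update(self, x):
        """Learn incrementally from a new normal observation."""
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def is_anomalous(self, x):
        """Flag values deviating more than z_threshold standard deviations."""
        if self.n < 2:
            return False
        std = math.sqrt(self.m2 / (self.n - 1))
        return std > 0 and abs(x - self.mean) / std > self.z_threshold

det = IncrementalDetector()
for value in [20.1, 19.8, 20.3, 20.0, 19.9, 20.2]:   # normal sensor readings
    det.update(value)
print("Is 27.5 anomalous?", det.is_anomalous(27.5))   # True under these settings
```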

Keywords: Degradation models, ageing, anomaly detection, soft sensor, incremental learning.
