Search results for: web based instruction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28577

26777 Predicting the Human Impact of Natural Onset Disasters Using Pattern Recognition Techniques and Rule Based Clustering

Authors: Sara Hasani

Abstract:

This research focuses on natural sudden-onset disasters, characterised as 'occurring with little or no warning and often cause excessive injuries far surpassing the national response capacities'. Based on a panel analysis of the historical record of 4,252 natural onset disasters between 1980 and 2015, a predictive method was developed to predict the human impact of a disaster (fatalities, injured, homeless) with less than 3% error. The geographical dispersion of the disasters includes every country where data were available and could be cross-examined from various humanitarian sources. The records were filtered to the 4,252 disasters for which the five predictive variables (disaster type, HDI, DRI, population, and population density) were clearly stated. The procedure was designed as a combination of pattern recognition techniques and rule-based clustering for prediction, with discriminant analysis to validate the results further. The results indicate that there is a relationship between a disaster's human impact and the five socio-economic characteristics of the affected country mentioned above. As a result, a framework was put forward that can predict a disaster's human impact based on its severity rank in the early hours after a disaster strikes. The predictions in this model are outlined as best- and worst-case scenarios, which inform the lower and upper ranges of the prediction. The necessity of developing the predictive framework is highlighted by the observation that, despite the existing research in the literature, a framework for predicting the human impact and estimating needs at the time of a disaster had yet to be developed. The framework can further be used to allocate resources in the response phase of a disaster, when data are scarce.
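
The clustering-then-range idea described above can be illustrated compactly. The sketch below is a minimal stand-in, assuming historical disasters are clustered on the five predictors and a new event's best- and worst-case impact is read from its cluster; the data, cluster count, and fatality-only target are hypothetical, not the paper's actual framework.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Columns: encoded disaster type, HDI, DRI, log population, log population density
X_hist = rng.random((4252, 5))
fatalities = rng.integers(0, 10_000, size=4252)     # synthetic human-impact record

model = KMeans(n_clusters=8, n_init=10, random_state=0).fit(X_hist)

def impact_range(event_features):
    """Best-case and worst-case fatality range from the new event's cluster."""
    cluster = model.predict(event_features.reshape(1, -1))[0]
    members = fatalities[model.labels_ == cluster]
    return int(members.min()), int(members.max())   # lower and upper prediction

print(impact_range(rng.random(5)))
```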

Keywords: disaster management, natural disaster, pattern recognition, prediction

Procedia PDF Downloads 153
26776 Gene Prediction in DNA Sequences Using an Ensemble Algorithm Based on Goertzel Algorithm and Anti-Notch Filter

Authors: Hamidreza Saberkari, Mousa Shamsi, Hossein Ahmadi, Saeed Vaali, MohammadHossein Sedaaghi

Abstract:

In recent years, using signal processing tools for accurate identification of protein coding regions has become a challenge in bioinformatics. Most genomic signal processing methods are based on the period-3 characteristic of the nucleotides in DNA strands; consequently, spectral analysis is applied to the numerical sequences of DNA to find the locations of periodic components. In this paper, a novel ensemble algorithm for gene selection in DNA sequences is presented, based on the combination of the Goertzel algorithm and an anti-notch filter (ANF). The proposed algorithm has many advantages compared to other conventional methods. Firstly, it identifies protein coding regions more accurately, owing to the Goertzel algorithm being tuned at the desired frequency. Secondly, a faster detection time is achieved. The proposed algorithm is applied to several genes, including genes available in the BG570 and HMR195 databases, and the results are compared to those of other methods using nucleotide-level evaluation criteria. Implementation results show the excellent performance of the proposed algorithm in identifying protein coding regions, specifically in the identification of small-scale gene areas.
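
As a concrete illustration of the period-3 building block, the sketch below evaluates the Goertzel recursion at the 2*pi/3 angular frequency over a sliding window of base-indicator sequences. It is a minimal sketch, assuming a plain window-sum detector; the paper's ensemble with the anti-notch filter stage is not reproduced.

```python
import numpy as np

def goertzel_power(x, w=2*np.pi/3):
    """Spectral power of x at angular frequency w via the Goertzel recursion."""
    coeff = 2*np.cos(w)
    s1 = s2 = 0.0
    for sample in x:
        s0 = sample + coeff*s1 - s2
        s2, s1 = s1, s0
    return s1*s1 + s2*s2 - coeff*s1*s2

def period3_profile(dna, win=351):
    """Sliding-window period-3 power, summed over A/C/G/T indicator sequences."""
    dna = dna.upper()
    profile = []
    for i in range(len(dna) - win + 1):
        window = dna[i:i+win]
        p = sum(goertzel_power(np.fromiter((c == base for c in window), float))
                for base in "ACGT")
        profile.append(p)
    return np.array(profile)   # peaks suggest protein coding regions

print(period3_profile("ATG" * 200 + "TTTTAAAACCCC" * 20)[:5])
```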

Keywords: protein coding regions, period-3, anti-notch filter, Goertzel algorithm

Procedia PDF Downloads 387
26775 Copula-Based Estimation of Direct and Indirect Effects in Path Analysis Model

Authors: Alam Ali, Ashok Kumar Pathak

Abstract:

Path analysis is a statistical technique used to evaluate the strength of the direct and indirect effects of variables. One or more structural regression equations are used to estimate a series of parameters in order to find a better fit to the data. Sometimes, exogenous variables do not show a significant strength in their direct and indirect effects when the assumptions of classical regression (ordinary least squares, OLS) are violated by the nature of the data. The main motive of this article is to investigate the efficacy of the copula-based regression approach over the classical regression approach and to calculate the direct and indirect effects of variables when the data violate the OLS assumptions and the variables are linked through an elliptical copula. We perform this study using a well-organized numerical scheme. Finally, a real-data application is presented to demonstrate the superiority of the copula approach.
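
The contrast at issue can be sketched in a few lines. Below, a hedged illustration (not the paper's estimator, which handles general elliptical copulas): direct and indirect effects in a simple X -> M -> Y path model are computed once by plain OLS on skewed margins and once after a Gaussian-copula-style normal-score transform. All data are simulated.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 1000
x = rng.exponential(size=n)                 # skewed margin violates OLS comfort
m = 0.6*x + rng.exponential(size=n)
y = 0.5*m + 0.3*x + rng.exponential(size=n)

def normal_scores(v):
    """Map a sample to standard normal scores via its empirical CDF (copula step)."""
    ranks = stats.rankdata(v) / (len(v) + 1)
    return stats.norm.ppf(ranks)

def path_effects(x, m, y):
    a = np.polyfit(x, m, 1)[0]                        # X -> M slope
    coef, *_ = np.linalg.lstsq(
        np.column_stack([m, x, np.ones(len(y))]), y, rcond=None)
    b, c = coef[0], coef[1]                           # M -> Y slope, direct X -> Y
    return c, a * b                                   # direct, indirect effect

print("OLS margins   :", path_effects(x, m, y))
print("Normal scores :", path_effects(*map(normal_scores, (x, m, y))))
```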

Keywords: path analysis, copula-based regression models, direct and indirect effects, k-fold cross validation technique

Procedia PDF Downloads 72
26774 Reed: An Approach Towards Quickly Bootstrapping Multilingual Acoustic Models

Authors: Bipasha Sen, Aditya Agarwal

Abstract:

A multilingual automatic speech recognition (ASR) system is a single entity capable of transcribing multiple languages sharing a common phone space. The performance of such a system is highly dependent on the compatibility of the languages. State-of-the-art speech recognition systems are built using sequential architectures based on recurrent neural networks (RNNs), limiting the computational parallelization in training. This poses a significant challenge in terms of the time taken to bootstrap and validate the compatibility of multiple languages for building a robust multilingual system. Complex architectural choices based on self-attention networks are made to improve the parallelization, thereby reducing the training time. In this work, we propose Reed, a simple system based on 1D convolutions which uses very short context to improve the training time. To improve the performance of our system, we use raw time-domain speech signals directly as input. This enables the convolutional layers to learn feature representations rather than relying on handcrafted features such as MFCCs. We report improvements in training and inference times by at least factors of 4x and 7.4x, respectively, against standard RNN-based baseline systems on SpeechOcean's multilingual low-resource dataset.
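
A minimal sketch of the kind of architecture described, assuming a toy layer layout (this is not the published Reed model): short-context 1D convolutions consume the raw time-domain waveform, so the front-end learns features instead of MFCCs, and every layer parallelizes over time.

```python
import torch
import torch.nn as nn

class TinyReed(nn.Module):
    def __init__(self, n_phones=100):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 64, kernel_size=11, stride=5), nn.ReLU(),    # raw audio in
            nn.Conv1d(64, 128, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv1d(128, 128, kernel_size=5, stride=2), nn.ReLU(),  # short context
        )
        self.head = nn.Linear(128, n_phones)   # shared multilingual phone space

    def forward(self, wav):                    # wav: (batch, samples)
        h = self.net(wav.unsqueeze(1))         # -> (batch, 128, frames)
        return self.head(h.transpose(1, 2))    # -> (batch, frames, phones) logits

logits = TinyReed()(torch.randn(2, 16000))     # one second of audio at 16 kHz
print(logits.shape)                            # torch.Size([2, 797, 100])
```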

Keywords: convolutional neural networks, language compatibility, low resource languages, multilingual automatic speech recognition

Procedia PDF Downloads 123
26773 Cryptography Based Authentication Methods

Authors: Mohammad A. Alia, Abdelfatah Aref Tamimi, Omaima N. A. Al-Allaf

Abstract:

This paper reviews a comparison study of the most commonly used authentication methods, some of which are based on cryptography. In this study, we describe the main cryptographic services and present a specific discussion of the authentication service, which is classified into several categories according to its methods. The study also gives real-life examples for each of the authentication methods, covering the simplest authentication methods as well as the available biometric authentication methods, such as voice, iris, fingerprint, and face authentication.

Keywords: information security, cryptography, system access control, authentication, network security

Procedia PDF Downloads 471
26772 Configuration Design and Optimization of the Movable Leg-Foot Lunar Soft-Landing Device

Authors: Shan Jia, Jinbao Chen, Jinhua Zhou, Jiacheng Qian

Abstract:

Lunar exploration is a necessary foundation for deep-space exploration. To address the functional limitations of the fixed landers that are currently in wide use, whose detection range can be expanded only by wheeled rovers with unavoidable path repeatability, a movable lunar soft-landing device based on a cantilever-type buffer mechanism and a leg-foot walking mechanism is presented. Firstly, a 20-DoF quadruped configuration based on pushrods is proposed. The configuration has bionic characteristics such as hip, knee, and ankle joints, and keeps the kinematics of the whole mechanism unchanged before and after buffering. Secondly, the multi-function main/auxiliary buffers, based on crumple-energy absorption and a screw-nut mechanism, are designed, together with the telescopic device that protects the plantar force sensors during the buffering process. Finally, the kinematic model of the whole mechanism is established, and the configuration optimization of the whole mechanism is completed based on the performance requirements of slope adaptation and obstacle crossing. This research can provide a technical solution integrating soft-landing, large-scale inspection, and material transfer for future lunar and even Mars exploration, and can also serve as the technical basis for developing reusable landers.

Keywords: configuration design, lunar soft-landing device, movable, optimization

Procedia PDF Downloads 158
26771 Streamlines: Paths of Fluid Flow through Sandstone Samples Based on Computed Microtomography

Authors: Ł. Kaczmarek, T. Wejrzanowski, M. Maksimczuk

Abstract:

The study presents the use of numerical calculations based on high-resolution computed microtomography in the analysis of fluid flow through Miocene sandstones; to this end, permeability studies of the rocks were performed. The Miocene samples were taken from well S-3, located in the eastern part of the Carpathian Foredeep. For the aforementioned analysis, two series of X-ray irradiation were performed. The first set of samples was selected to obtain the spatial distribution of grains and pores; at this stage of the study, the voxel side length was 27 µm. The next set of X-ray irradiation enabled recognition of microstructural components as well as petrophysical features; the voxel side length in this stage was down to 2 µm. Based on this study, the samples were divided into two distinct groups: the first represents conventional reservoir deposits, in contrast to the second, unconventional type. Appropriate identification of petrophysical parameters such as the porosity and permeability of the formation is a key element in optimizing reservoir development.

Keywords: grains, permeability, pores, pressure distribution

Procedia PDF Downloads 254
26770 Detection of Nanotoxic Material Using DNA Based QCM

Authors: Juneseok You, Chanho Park, Kuehwan Jang, Sungsoo Na

Abstract:

Sensing of nanotoxic materials is critically important, as their engineering applications have been growing recently and nanotoxic materials can harmfully influence human health and the environment. In the current study, we report quartz crystal microbalance (QCM)-based, in situ, real-time sensing of a nanotoxic material by frequency shift. We propose the in situ detection of the nanotoxic material zinc oxide using a QCM functionalized with a target-specific DNA. Since the mass of a target material is comparable to that of an atom, the mass change caused by target binding to the DNA on the quartz electrode is so small that it is practically difficult to detect the ions at low concentrations. In our study, we have demonstrated the in situ and fast detection of zinc oxide using the QCM. The detection was derived from DNA hybridization at the DNA-functionalized quartz electrode. The results suggest that QCM-based detection opens a new avenue for the development of a practical water-testing sensor.
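
The transduction principle behind QCM sensing admits a back-of-the-envelope check: the Sauerbrey relation converts a measured resonance frequency shift into adsorbed mass per area. The sketch below uses standard quartz constants; the 5 MHz crystal and the ZnO/DNA specifics are illustrative assumptions, not values from the paper.

```python
f0  = 5.0e6        # fundamental resonance frequency of the crystal, Hz (assumed)
rho = 2.648        # density of quartz, g/cm^3
mu  = 2.947e11     # shear modulus of AT-cut quartz, g/(cm*s^2)

def sauerbrey_mass(delta_f_hz):
    """Adsorbed areal mass (g/cm^2) for a frequency drop delta_f (Hz > 0)."""
    sensitivity = 2 * f0**2 / (rho * mu) ** 0.5   # Hz per (g/cm^2), ~5.66e7
    return delta_f_hz / sensitivity

# A 10 Hz drop on a 5 MHz crystal corresponds to roughly 177 ng/cm^2:
print(sauerbrey_mass(10.0) * 1e9, "ng/cm^2")
```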

Keywords: nanotoxic material, QCM, frequency, in situ sensing

Procedia PDF Downloads 422
26769 Development of a Congestion Controller of Computer Network Using Artificial Intelligence Algorithm

Authors: Mary Anne Roa

Abstract:

Congestion in a network occurs when aggregate demand exceeds the available capacity of the resources. Network congestion will increase as network speed increases, and new, effective congestion control methods are needed, especially for today's very high-speed networks. To address this undeniably global issue, the study focuses on the development of a fuzzy-based congestion control model concerned with allocating the resources of a computer network such that the system can operate at an adequate performance level when demand exceeds, or is near, the capacity of the resources. Fuzzy-logic-based models have proven capable of accurately representing a wide variety of processes. The model built is based on the bandwidth, the aggregate incoming traffic, and the waiting time. The theoretical analysis and simulation results show that the proposed algorithm provides not only good utilization but also low packet loss.
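
An illustrative Mamdani-style fragment of such a controller is sketched below, assuming an invented two-input rule base (the paper uses bandwidth, incoming traffic, and waiting time): fuzzify queue occupancy and incoming traffic, fire the rules, and defuzzify to a packet-drop probability.

```python
def tri(x, a, b, c):
    """Triangular membership function on points a <= b <= c."""
    return max(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def drop_probability(queue, traffic):
    """queue, traffic in [0,1] (fraction of capacity) -> drop probability."""
    q_low, q_high = tri(queue, -1, 0, 1), tri(queue, 0, 1, 2)
    t_low, t_high = tri(traffic, -1, 0, 1), tri(traffic, 0, 1, 2)
    rules = [
        (min(q_low,  t_low),  0.0),   # light load: never drop
        (min(q_low,  t_high), 0.3),   # burst into an empty queue: drop a little
        (min(q_high, t_low),  0.5),   # backlog draining: moderate drop
        (min(q_high, t_high), 1.0),   # congestion: drop aggressively
    ]
    total = sum(strength for strength, _ in rules)
    return sum(s * out for s, out in rules) / total if total else 0.0

print(drop_probability(queue=0.8, traffic=0.9))   # heavy load -> high drop rate
```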

Keywords: congestion control, queue management, computer networks, fuzzy logic

Procedia PDF Downloads 397
26768 The Estimation of Human Vital Signs Complexity

Authors: L. Bikulciene, E. Venskaityte, G. Jarusevicius

Abstract:

Non-stationary and nonlinear signals generated by living complex systems defy traditional mechanistic approaches, which are based on homeostasis. Our previous studies have shown that evaluating the interactions of physiological signals with special analysis methods is suitable for observing physiological processes. We demonstrate the possibility of using a deep-physiological-model-based interpretation of changes in the human body's functional states, combined with an analytical method based on matrix theory for physiological signal analysis, applied to high-risk cardiac patients. It is shown that the evaluation of cardiac signal interactions reveals functional changes peculiar to each individual at the onset of the hemodynamic restoration procedure. Therefore, we suggest that assessment of the alterations in the body's functional state after patients undergo surgery can be complemented by data obtained from the suggested approach of evaluating the interactions of functional variables.

Keywords: cardiac diseases, complex systems theory, ECG analysis, matrix analysis

Procedia PDF Downloads 344
26767 H.263 Based Video Transceiver for Wireless Camera System

Authors: Won-Ho Kim

Abstract:

In this paper, a design of an H.263-based wireless video transceiver is presented for a wireless camera system. It uses a standard WiFi transceiver, and the coverage area is up to 100 m. Furthermore, the standard H.263 video encoding technique is used for video compression, since a wireless video transmitter is unable to transmit high-capacity raw data in real time; the implemented system is capable of streaming NTSC 720x480 video at less than 1 Mbps.

Keywords: wireless video transceiver, video surveillance camera, H.263 video encoding, digital signal processing

Procedia PDF Downloads 364
26766 Frame Camera and Event Camera in Stereo Pair for High-Resolution Sensing

Authors: Khen Cohen, Daniel Yankelevich, David Mendlovic, Dan Raviv

Abstract:

We present a 3D stereo system for high-resolution sensing in both the spatial and the temporal domains by combining a frame-based camera and an event-based camera. We establish a method to merge both devices into one unified system and introduce a calibration process, followed by a correspondence technique and an interpolation algorithm for 3D reconstruction. We further provide a quantitative analysis of our system in terms of depth resolution, along with additional parameter analysis. We show experimentally how our system performs temporal super-resolution up to an effective 1 ms and can detect fast-moving objects and human micro-movements that can be used for micro-expression analysis. We also demonstrate how our method can extract colored events for an event-based camera without any degradation in the spatial resolution, compared to a color filter array.

Keywords: DVS-CIS stereo vision, micro-movements, temporal super-resolution, 3D reconstruction

Procedia PDF Downloads 297
26765 A Modified NSGA-II Algorithm for Solving Multi-Objective Flexible Job Shop Scheduling Problem

Authors: Aydin Teymourifar, Gurkan Ozturk, Ozan Bahadir

Abstract:

NSGA-II is one of the most well-known and most widely used evolutionary algorithms. In addition to its newer versions, such as NSGA-III, several modified variants of this algorithm exist in the literature. In this paper, a hybrid NSGA-II algorithm is suggested for solving the multi-objective flexible job shop scheduling problem. For a better search, new neighborhood-based crossover and mutation operators are defined. To create new generations, neighbors of the individuals selected by tournament selection are constructed. Also, at the end of each iteration, before sorting, neighbors of a certain number of good solutions are derived, except for solutions protected by elitism. The neighbors are generated using a constraint-based neural network that uses various constructs. The non-dominated sorting and crowding distance operators are the same as in classic NSGA-II. A comparison based on several multi-objective benchmarks from the literature shows the efficiency of the algorithm.
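
For reference, the crowding-distance operator that the modified algorithm keeps unchanged from classic NSGA-II is sketched below (the paper's neighborhood operators are specific to its scheduling encoding and are not reproduced here).

```python
import numpy as np

def crowding_distance(front):
    """front: (n_solutions, n_objectives) array -> crowding distance per solution."""
    n, m = front.shape
    dist = np.zeros(n)
    for j in range(m):
        order = np.argsort(front[:, j])
        span = front[order[-1], j] - front[order[0], j] or 1.0
        dist[order[0]] = dist[order[-1]] = np.inf       # keep boundary solutions
        dist[order[1:-1]] += (front[order[2:], j] - front[order[:-2], j]) / span
    return dist

# Two-objective example: boundary points get inf, interior points finite values.
print(crowding_distance(np.array([[1., 5.], [2., 3.], [4., 2.], [6., 1.]])))
```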

Keywords: flexible job shop scheduling problem, multi-objective optimization, NSGA-II algorithm, neighborhood structures

Procedia PDF Downloads 229
26764 Viral Metagenomics Revealed a Novel Cardiovirus in Feces of Wild Rats

Authors: Asif Mahmood, Shama Shama, Hao Ni, Hao Wang, Yu Ling, Hui Xu, Shixing Yang, Qais Ahmad Naseer, Wen Zhang

Abstract:

Cardiovirus is a genus of viruses belonging to the family Picornaviridae. Here, we used viral metagenomic techniques to detect viral nucleic acid in fecal samples from wild rats in Zhenjiang city, China. Fecal samples were collected from 20 wild rats, pooled into four sample pools, and subjected to library construction; the libraries were then sequenced on the Illumina MiSeq platform. The sequenced reads were analyzed using a viral metagenomic analysis pipeline. A novel cardiovirus, named amzj-2018, was identified from the feces of a wild rat, and its complete genome was acquired. Phylogenetic analysis based on the complete amino acid sequence of the polyprotein revealed that amzj-2018 formed a separate branch located between the clusters of Saffold virus and Rat Theilovirus 1 (RTV-1). Phylogenetic analyses based on different regions of the polyprotein, including P1, P2, P3, and P2+P3, respectively, showed discordant trees, where the tree based on the P3 region indicated that amzj-2018 clustered separately between Theiler's murine encephalomyelitis virus and RTV-1. The complete genome of a cardiovirus was thus determined from the feces of wild rats; it belongs to a novel type of cardiovirus based on phylogenetic analysis. Whether it is associated with disease needs further investigation.

Keywords: cardiovirus, viral metagenomics, genomic organization, phylogenetic analysis

Procedia PDF Downloads 18
26763 Development of Open Source Geospatial Certification Model Based on Geospatial Technology Competency Model

Authors: Tanzeel Ur Rehman Khan, Franz Josef Behr, Phillip Davis

Abstract:

Open source geospatial certifications are needed in geospatial technology education and the industry sector. In parallel with proprietary software, free and open source software solutions have become important in geospatial technology research and play an important role in the growth of the geospatial industry. ESRI, GISCI (GIS Certification Institute), ASPRS (American Society for Photogrammetry and Remote Sensing), and Meta spatial offer certifications on proprietary and open source software. These are portfolio- and competency-based certifications grounded in the GIS Body of Knowledge (BoK). The analysis of these certification approaches might reveal gaps in them and open a new way to develop certifications for geospatial open source (OS) software. This new certification will investigate the different geospatial competencies associated with open source tools, helping to identify geospatial professionals and strengthen geospatial academic content. The goal of this research is to introduce a geospatial certification model based on the geospatial technology competency model (GTCM). The developed certification will not only underline the importance of geospatial education and the production of a geospatial competency-based workforce in universities and companies (private or public) but also describe open source solutions with their tools and technology. Job analysis, market analysis, and survey analysis of this certification open a new horizon for business as well.

Keywords: geospatial certification, open source, geospatial technology competency model, geoscience

Procedia PDF Downloads 566
26762 Improvement of Camera Calibration Based on the Relationship between Focal Length and Aberration Coefficient

Authors: Guorong Sui, Xingwei Jia, Chenhui Yin, Xiumin Gao

Abstract:

In camera-based high-precision, non-contact measurement, geometric-optical aberration inevitably disturbs the measuring system. Moreover, the aberration varies with focal length, which increases the difficulty of calibrating the system. Therefore, understanding the aberration properties as a function of focal length is a very important issue in the calibration of measuring systems. In this study, we propose a new mathematical model based on Zhang Zhengyou's plane calibration method and establish a relationship between the focal length and the aberration coefficient. By using the mathematical model and carefully modified compensation templates, the calibration precision of the system can be dramatically improved. The experimental results show that the relative error is less than 1%. This is important for optoelectronic imaging systems that measure, track, and position targets by changing the camera's focal length.
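
A hedged sketch of the core idea as we read it: calibrate the radial distortion coefficient k1 at a few focal lengths (e.g., with Zhang's planar method), fit k1 as a smooth function of focal length, and interpolate the compensation at intermediate zoom settings. All numbers below are made up for illustration; the paper's actual model relating focal length and aberration coefficient is not reproduced.

```python
import numpy as np

focal_mm = np.array([8.0, 12.0, 16.0, 25.0, 35.0])        # calibrated settings
k1       = np.array([-0.21, -0.14, -0.10, -0.06, -0.04])  # k1 from calibration

coeffs = np.polyfit(focal_mm, k1, deg=2)     # fit k1(f) with a quadratic model

def aberration_coefficient(f_mm):
    """Predicted k1 at an uncalibrated focal length."""
    return np.polyval(coeffs, f_mm)

def undistort_radius(r, f_mm):
    """First-order radial distortion correction r_u = r * (1 + k1 * r^2)."""
    return r * (1 + aberration_coefficient(f_mm) * r**2)

print(aberration_coefficient(20.0), undistort_radius(0.5, 20.0))
```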

Keywords: camera calibration, aberration coefficient, vision measurement, focal length, mathematical model

Procedia PDF Downloads 364
26761 A Comparative Study of Sampling-Based Uncertainty Propagation with First Order Error Analysis and Percentile-Based Optimization

Authors: M. Gulam Kibria, Shourav Ahmed, Kais Zaman

Abstract:

In system analysis, uncertainty in the input variables causes uncertainty in the system responses. Different probabilistic approaches for uncertainty representation and propagation in such cases exist in the literature, and different uncertainty representation approaches result in different outputs; some approaches might estimate the system response better than others. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge (MUQC) has posed challenges in uncertainty quantification. Subproblem A of the challenge, the uncertainty characterization subproblem, is addressed in this study. In this subproblem, the challenge is to gather knowledge about unknown model inputs, which have inherent aleatory and epistemic uncertainties, from the responses (outputs) of the given computational model. We use two different methodologies to approach the problem. In the first methodology, we use sampling-based uncertainty propagation with first-order error analysis. In the other approach, we place emphasis on the use of Percentile-Based Optimization (PBO). The NASA Langley MUQC's subproblem A is constructed in such a way that both aleatory and epistemic uncertainties need to be managed. The challenge problem classifies each uncertain parameter as belonging to one of the following three types: (i) an aleatory uncertainty modeled as a random variable, with a fixed functional form and known coefficients; this uncertainty cannot be reduced; (ii) an epistemic uncertainty modeled as a fixed but poorly known physical quantity that lies within a given interval; this uncertainty is reducible; (iii) a parameter that might be aleatory but for which sufficient data might not be available to adequately model it as a single random variable. For example, the parameters of a normal variable, e.g., the mean and standard deviation, might not be precisely known but could be assumed to lie within some intervals. This results in a distributional p-box: the physical parameter carries an aleatory uncertainty, but the parameters prescribing its mathematical model are subject to epistemic uncertainties, each being an unknown element of a known interval. This uncertainty is reducible. From the study, it is observed that, due to practical limitations and computational expense, the sampling in the sampling-based methodology is not exhaustive; consequently, the sampling-based methodology has a high probability of underestimating the output bounds. Therefore, an optimization-based strategy to convert uncertainty described by interval data into a probabilistic framework is necessary, which is achieved in this study by using PBO.
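
The double-loop treatment of a distributional p-box described above can be sketched directly: an outer loop samples the epistemic interval parameters, an inner loop propagates aleatory samples, and the output bounds are the envelope over the outer loop. The model, intervals, and sample sizes below are illustrative, not the NASA challenge problem.

```python
import numpy as np

rng = np.random.default_rng(42)
model = lambda a, b: a**2 + np.sin(b)          # stand-in computational model

mu_interval    = (0.5, 1.5)                    # epistemic: poorly known mean
sigma_interval = (0.1, 0.4)                    # epistemic: poorly known std dev

lo, hi = np.inf, -np.inf
for _ in range(200):                           # outer (epistemic) loop
    mu = rng.uniform(*mu_interval)
    sigma = rng.uniform(*sigma_interval)
    a = rng.normal(mu, sigma, size=1000)       # inner (aleatory) loop, type (iii)
    b = rng.uniform(0, np.pi, size=1000)       # fully aleatory input, type (i)
    y = model(a, b)
    lo = min(lo, np.percentile(y, 5))
    hi = max(hi, np.percentile(y, 95))

print(f"envelope of the 5th-95th percentile band: [{lo:.3f}, {hi:.3f}]")
# Non-exhaustive sampling tends to understate this envelope, which is
# precisely what motivates the percentile-based optimization alternative.
```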

Keywords: aleatory uncertainty, epistemic uncertainty, first order error analysis, uncertainty quantification, percentile-based optimization

Procedia PDF Downloads 240
26760 Architecture for QoS Based Service Selection Using Local Approach

Authors: Gopinath Ganapathy, Chellammal Surianarayanan

Abstract:

Services are growing rapidly, and generally they are aggregated into composite services to accomplish complex business processes. There may be several services that offer the same required function for a particular task in a composite service, so a choice has to be made among functionally similar alternatives. Quality of Service (QoS) serves as a discriminating factor in selecting which component services should be used to satisfy the quality requirements of a user during service composition. There are two categories of approaches for QoS-based service selection, namely global and local approaches. Global approaches are known to be NP-hard and offer poor scalability in large-scale composition. As an alternative to global methods, local selection methods, which reduce the search space by breaking the large, complex problem of selecting services for the workflow into independent subproblems of selecting services for individual tasks, are emerging. In this paper, a distributed architecture for selecting services based on QoS using local selection is presented, with an overview of the local selection methodology. The architecture describes the core components needed to implement the local approach, namely the selection manager and the QoS manager, and their functions. The selection manager consists of two components: a constraint decomposer, which decomposes the given global (workflow-level) constraints into local (task-level) constraints, and a service selector, which selects for each task the service with maximum utility that satisfies the corresponding local constraints. The QoS manager manages QoS information at two levels, namely the service class level and the individual service level. The architecture serves as an implementation model for local selection.
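
A toy sketch of the two components just described: a constraint decomposer that splits a workflow-level response-time budget evenly across tasks, and a service selector that independently picks, per task, the highest-utility candidate meeting its local constraint. The candidate data, the even split, and the utility values are invented for illustration.

```python
candidates = {                        # task -> [(service, response_ms, utility)]
    "bookFlight": [("S1", 120, 0.90), ("S2", 60, 0.70), ("S3", 200, 0.95)],
    "bookHotel":  [("H1", 90, 0.80), ("H2", 150, 0.90)],
    "payment":    [("P1", 40, 0.60), ("P2", 80, 0.85)],
}

def decompose(global_budget_ms, tasks):
    """Constraint decomposer: even split of the global constraint per task."""
    return {t: global_budget_ms / len(tasks) for t in tasks}

def select(candidates, global_budget_ms=300):
    local = decompose(global_budget_ms, candidates)
    plan = {}
    for task, options in candidates.items():
        feasible = [o for o in options if o[1] <= local[task]]   # local check
        plan[task] = max(feasible, key=lambda o: o[2])[0]        # max utility
    return plan

# Each task is solved independently, which is what makes the approach scale.
print(select(candidates))   # {'bookFlight': 'S2', 'bookHotel': 'H1', 'payment': 'P2'}
```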

Keywords: architecture of service selection, local method for service selection, QoS based service selection, approaches for QoS based service selection

Procedia PDF Downloads 426
26759 Assessing the Leadership Succession Plan in Faith-Based Senior High Schools in Ghana and Its Associated Challenges

Authors: J. E. Cobbinah

Abstract:

One of the most challenging issues confronting schools is good leadership succession planning. Experts argue that, although leadership succession planning is one of the strategies or practices that can help sustain improvement and promote continuity of good leadership, it seems to have been neglected in many schools over the years. Appointment of head teachers in senior high schools is based on long service or one's ability to demonstrate competence in a leadership selection interview. There is no clear and well-structured leadership succession plan before a leadership position is filled, and school leadership succession planning seems to be an issue that nobody talks about. In faith-based schools the issue is even worse, because religious groups impose whomever they consider strong in the faith on schools as leaders, irrespective of the individual's competence and preparedness to take up the challenges associated with a leadership position. Therefore, the present study examined the nature (including the type) of leadership succession plans in faith-based senior high schools and their associated challenges. A convergent mixed-methods design was employed to achieve the objectives of the study effectively. The data collection strategies involved interviews, questionnaires, and reviews of secondary data. The data were gathered from students, school leaders (head teachers, deputy heads, and heads of departments), selected Parent-Teacher Association members, school management committee members, and school governors. The results show that, although governors of faith-based schools are making efforts to enhance education quality by making school leadership accountable, the absence and neglect of a clear and well-structured leadership succession plan has negative outcomes: unsustainable student academic performance, lack of support from existing staff and senior leaders, and lack of support in the implementation of school improvement plans. It is concluded that faith-based schools should focus on leadership competence and ability in the selection of potential school leaders to achieve a good succession plan, rather than appointing leaders simply because they are affiliates of the faith.

Keywords: school leadership, succession planning, faith-based schools, school governors

Procedia PDF Downloads 445
26758 Research on Air Pollution Spatiotemporal Forecast Model Based on LSTM

Authors: JingWei Yu, Hong Yang Yu

Abstract:

At present, increasingly serious air pollution in various cities of China has made people pay more attention to the air quality index (hereinafter referred to as AQI) of their living areas. In the face of this situation, it is of great significance to predict air pollution in heavily polluted areas. In this paper, based on the LSTM time series model, a spatiotemporal prediction model of PM2.5 concentration in Mianyang, Sichuan Province, is established. The model fully considers the temporal variability and spatial distribution characteristics of PM2.5 concentration. The spatial correlation of air quality across locations is captured by using the air quality status of nearby monitoring stations, including AQI and meteorological data, to predict the air quality of a given monitoring station. The experimental results show that the method has good prediction accuracy: the fitting degree with the actual measured data reaches more than 0.7, so it can be applied to the modeling and prediction of the spatial and temporal distribution of regional PM2.5 concentration.
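
A minimal PyTorch sketch of the kind of spatiotemporal LSTM described, assuming placeholder dimensions: each input step concatenates the target station's own history with AQI and meteorology features from nearby stations.

```python
import torch
import torch.nn as nn

class PM25LSTM(nn.Module):
    def __init__(self, n_features=12, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, num_layers=2, batch_first=True)
        self.out = nn.Linear(hidden, 1)        # next-hour PM2.5 concentration

    def forward(self, x):                      # x: (batch, time, features)
        h, _ = self.lstm(x)
        return self.out(h[:, -1])              # predict from the last hidden state

# 24-hour window; features = target station + neighbors' AQI + meteorology.
model = PM25LSTM()
x = torch.randn(8, 24, 12)
print(model(x).shape)                          # torch.Size([8, 1])
```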

Keywords: LSTM, PM2.5, neural networks, spatio-temporal prediction

Procedia PDF Downloads 134
26757 Distribution and Characterization of Thermal Springs in Northern Oman

Authors: Fahad Al Shidi, Reginald Victor

Abstract:

This study was conducted in northern Oman to assess the physical and chemical characteristics of 40 thermal springs distributed across the Al Hajar Mountains. Physical measurements of water samples were carried out in the two main seasons in Oman (winter and summer 2019). The studied springs were classified into three groups based on water temperature, four groups based on water pH, and two groups based on conductivity. Ten thermal alkaline springs that originated in the ophiolite (Samail Nappe) were dominated by high pH (>11), elevated concentrations of Cl⁻ and Na⁺ ions, and relatively low temperature and discharge ratio. Other springs in the Hajar Supergroup massif recorded high concentrations of Ca²⁺ and SO₄²⁻ ions, controlled by the dominant rock type, geochemical processes, and mineralization. Only one spring had brackish water, with very high conductivity (5500 µS/cm) and total dissolved solids; it is not suitable for irrigation purposes because of the high abundance of Na⁺, Cl⁻, and Ca²⁺ ions.

Keywords: alkaline springs, geothermal, HSG, ophiolite

Procedia PDF Downloads 143
26756 Developing a Knowledge-Based Lean Six Sigma Model to Improve Healthcare Leadership Performance

Authors: Yousuf N. Al Khamisi, Eduardo M. Hernandez, Khurshid M. Khan

Abstract:

Purpose: This paper presents a Knowledge-Based (KB) model using Lean Six Sigma (L6σ) principles to enhance the performance of healthcare leadership. Design/methodology/approach: Using L6σ principles to enhance healthcare leaders' performance requires a pre-assessment of the healthcare organisation's capabilities. The model is developed using a rule-based KB system approach; thus, the KB system embeds Gauging Absence of Pre-requisites (GAP) for benchmarking and the Analytical Hierarchy Process (AHP) for prioritization. A comprehensive literature review covers the main contents of the model, with typical outputs of the GAP analysis and AHP. Findings: The proposed KB system benchmarks the current position of healthcare leadership against an ideal benchmark (resulting from extensive evaluation by the KB/GAP/AHP system of international leadership concepts in healthcare environments). Research limitations/implications: Future work includes validating the implementation model in healthcare environments around the world. Originality/value: This paper presents a novel application of a hybrid KB system combining GAP and AHP methodologies. It implements L6σ principles to enhance healthcare performance. This approach assists healthcare leaders' decision making in reaching performance improvement against a best-practice benchmark.
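
The AHP prioritization step embedded in the KB system follows a standard recipe, sketched below with an invented pairwise-comparison matrix: derive priority weights from the principal eigenvector and check the consistency ratio.

```python
import numpy as np

A = np.array([[1,   3,   5 ],      # pairwise judgments for three leadership
              [1/3, 1,   2 ],      # improvement areas (illustrative values)
              [1/5, 1/2, 1 ]], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                      # AHP priority vector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)          # consistency index
cr = ci / 0.58                                # random index RI = 0.58 for n = 3
print(weights, f"CR = {cr:.3f} (judgments acceptable if CR < 0.1)")
```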

Keywords: Lean Six Sigma (L6σ), Knowledge-Based System (KBS), healthcare leadership, Gauge Absence Prerequisites (GAP), Analytical Hierarchy Process (AHP)

Procedia PDF Downloads 166
26755 Comparison of Seismic Response for Two RC Curved Bridges with Different Column Shapes

Authors: Nina N. Serdar, Jelena R. Pejović

Abstract:

This paper presents a seismic risk assessment of two bridge structures, based on the probabilistic performance-based seismic assessment methodology. Both investigated bridges are three-span continuous RC curved bridges that differ in column shape. The first bridge (type A) has a wall-type pier, and the second (type B) has a two-column bent with circular columns. The bridges are designed according to European standards: EN 1991-2, EN 1992-1-1, and EN 1998-2. The aim of the analysis is to compare the seismic behavior of these two structures and to detect the influence of column shape on the seismic response. The seismic risk assessment is carried out by obtaining demand fragility curves. A nonlinear model was constructed, and time-history analyses were performed using thirty-five pairs of horizontal ground motions selected to match the site-specific hazard. In the performance-based analysis, the peak column drift ratio (CDR) was selected as the engineering demand parameter (EDP), and spectral displacement was selected as the seismic intensity measure (IM). Demand fragility curves, which give the probability of exceeding a certain value of the chosen EDP, were constructed, and conclusions were drawn from them.
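
A compact sketch of how such a demand fragility curve is commonly built (one plausible recipe, not necessarily the authors' exact procedure): assume P(CDR > limit | IM) follows a lognormal CDF in the intensity measure and fit its median and dispersion by maximum likelihood to exceedance observations from the time-history runs. The 35 data points below are synthetic stand-ins.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(7)
im = rng.uniform(0.05, 0.60, 35)                     # spectral displacement, m
true_p = stats.norm.cdf(np.log(im / 0.25) / 0.4)
exceeded = (rng.random(35) < true_p).astype(float)   # did CDR exceed the limit?

def neg_log_like(params):
    median, beta = params
    p = stats.norm.cdf(np.log(im / median) / beta)   # lognormal fragility model
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.sum(exceeded*np.log(p) + (1 - exceeded)*np.log(1 - p))

res = optimize.minimize(neg_log_like, x0=[0.2, 0.5],
                        bounds=[(1e-3, 2.0), (0.05, 1.5)])
median, beta = res.x
print(f"fragility: median IM = {median:.3f}, dispersion = {beta:.3f}")
```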

Keywords: RC curved bridge, demand fragility curve, wall type column, nonlinear time-history analysis, circular column

Procedia PDF Downloads 341
26754 Motion Performance Analyses and Trajectory Planning of the Movable Leg-Foot Lander

Authors: Shan Jia, Jinbao Chen, Jinhua Zhou, Jiacheng Qian

Abstract:

In response to the functional limitations of fixed landers, whose detection range in current deep-space exploration can be expanded only by wheeled rovers with unavoidable path repeatability, a movable lander based on a leg-foot walking mechanism is presented. Firstly, a quadruped landing mechanism based on pushrod damping is proposed. The configuration has bionic characteristics such as hip, knee, and ankle joints, and multi-function main/auxiliary buffers based on crumple-energy absorption and a screw-nut mechanism. Secondly, the workspace of the end of the leg-foot mechanism is solved by the Monte Carlo method, and the key points on the desired trajectory of the end of the leg-foot mechanism are fitted by a cubic spline curve. Finally, an optimal time-jerk trajectory based on weight coefficients is planned and analyzed by an adaptive genetic algorithm (AGA). The simulation results prove the rationality and stability of the walking motion of the movable leg-foot lander on the planetary surface. In addition, this research can provide a technical solution integrating soft-landing, large-scale inspection, and material transfer for future planetary exploration, and can even serve as the technical basis for developing reusable landers.
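
The spline-plus-objective step lends itself to a short illustration. The sketch below fits a cubic spline through invented key points of a foot path and scores a candidate timing with a weighted time-jerk objective of the kind an adaptive GA would minimize; key points, weight, and the scalar trajectory are all illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import CubicSpline

t_key = np.array([0.0, 0.4, 0.9, 1.5])           # key-point times, s
z_key = np.array([0.0, 0.08, 0.10, 0.0])         # foot height at key points, m
path = CubicSpline(t_key, z_key)

def time_jerk_cost(spline, t_end, weight=0.7):
    """Weighted sum of total time and integrated squared jerk."""
    t = np.linspace(0, t_end, 500)
    jerk = spline(t, 3)                          # third derivative of the spline
    jerk_term = np.trapz(jerk**2, t)
    return weight*t_end + (1 - weight)*jerk_term

print(time_jerk_cost(path, t_key[-1]))           # fitness a GA individual would get
```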

Keywords: motion performance, trajectory planning, movable, leg-foot lander

Procedia PDF Downloads 139
26753 Investigating the Challenges Faced by English Language Teachers in Implementing the Outcome Based Education Model in Engineering Universities of Sindh

Authors: Habibullah Pathan

Abstract:

The present study aims to explore the problems faced by English Language Teachers (ELT) while implementing the Outcome Based Education (OBE) model in the engineering universities of Sindh. OBE is an emerging model and an initiative of the International Engineering Alliance. Traditional educational systems are teacher-centered or curriculum-centered, in which learners are not able to achieve the desired outcomes, whereas the OBE model enables learners to know the outcomes before the start of the program. OBE is a circular process that begins with the needs and demands of society and of stakeholders, who ask experts to produce graduates who can fulfill those needs, and ends with new enrollment in the respective programs of students who can work according to those demands. In all engineering institutions, engineering courses as well as English language courses are taught on the OBE model. English language teachers were interviewed to learn in depth about the problems they face. The study found that teachers were facing problems related to pedagogy, OBE training, assessment, evaluation, and administrative support. This study will be a guide for public- and private-sector English language teachers in coping with these challenges while teaching English on the OBE model. OBE is an emerging model by which institutions can produce graduates who can meet society's demands.

Keywords: problems of ELT teachers, outcome based education (OBE), implementing, assessment

Procedia PDF Downloads 98
26752 Agent-Based Modeling to Simulate the Dynamics of Health Insurance Markets

Authors: Haripriya Chakraborty

Abstract:

The healthcare system in the United States is considered to be one of the most inefficient and expensive systems when compared to other developed countries. Consequently, there are persistent concerns regarding the overall functioning of this system. For instance, the large number of uninsured individuals and high premiums are pressing issues that are shown to have a negative effect on health outcomes with possible life-threatening consequences. The Affordable Care Act (ACA), which was signed into law in 2010, was aimed at improving some of these inefficiencies. This paper aims at providing a computational mechanism to examine some of these inefficiencies and the effects that policy proposals may have on reducing these inefficiencies. Agent-based modeling is an invaluable tool that provides a flexible framework to model complex systems. It can provide an important perspective into the nature of some interactions that occur and how the benefits of these interactions are allocated. In this paper, we propose a novel and versatile agent-based model with realistic assumptions to simulate the dynamics of a health insurance marketplace that contains a mixture of private and public insurers and individuals. We use this model to analyze the characteristics, motivations, payoffs, and strategies of these agents. In addition, we examine the effects of certain policies, including some of the provisions of the ACA, aimed at reducing the uninsured rate and the cost of premiums to move closer to a system that is more equitable and improves health outcomes for the general population. Our test results confirm the usefulness of our agent-based model in studying this complicated issue and suggest some implications for public policies aimed at healthcare reform.
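
A stripped-down sketch of the kind of agent-based simulation proposed here: individuals with heterogeneous health risk decide each period whether a premium is worth paying, and an insurer re-prices from its loss experience. All behavioral rules and numbers are illustrative assumptions, not the paper's calibrated model.

```python
import numpy as np

rng = np.random.default_rng(3)
risk = rng.beta(2, 5, size=10_000)        # expected annual claim, in $1000s
premium = 0.5                             # insurer's starting price, $1000s

for year in range(10):
    buys = risk * 1.2 > premium           # agents buy if expected value is close
    claims = rng.poisson(risk[buys]).sum()            # realized losses, $1000s
    collected = premium * buys.sum()                  # premium revenue, $1000s
    # insurer nudges the premium toward break-even on last year's pool
    premium *= np.clip(claims / max(collected, 1e-9), 0.8, 1.2)
    print(f"year {year}: insured {buys.mean():.1%}, premium ${premium*1000:.0f}")

# Adverse selection appears as a shrinking pool when the premium rises, which
# is the kind of dynamic a policy lever (subsidy, mandate) would target.
```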

Keywords: agent-based modeling, healthcare reform, insurance markets, public policy

Procedia PDF Downloads 138
26751 A Design of an Augmented Reality Based Virtual Heritage Application

Authors: Stephen Barnes, Ian Mills, Frances Cleary

Abstract:

Augmented and virtual reality-based applications offer many benefits for the heritage and tourism sector. This technology provides a platform to showcase the regions of interest to people without the need for them to be physically present, which has had a positive impact on enticing tourists to visit those locations. However, the technology also provides the opportunity to present historical artefacts in a form that accurately represents their original, intended appearance. Three sites of interest were identified in the Lingaun Valley in South East Ireland, wherein virtual representations of site-specific artefacts of interest were created via a multidisciplinary team encompassing archaeology, art history, 3D modelling, design, and software development. The collated information has been presented to users via an augmented reality mobile-based application that provides information in an engaging manner that encourages an interest in history as well as visits to the sites in the Lingaun Valley.

Keywords: augmented reality, virtual heritage, 3D modelling, archaeology, virtual representation

Procedia PDF Downloads 81
26750 The Impact of Preference-Based Employee Deployment toward Employee Satisfaction and Organizational Performance: Case Study in Directorate General of State Asset Management, Ministry of Finance of the Republic of Indonesia

Authors: Rahmat Irawan, Mundhir Hanifsyam Harahap, Andar Ristabet Hesda

Abstract:

As a public sector organization in Indonesia, the Directorate General of State Asset Management (DGSAM), a unit under the Ministry of Finance of the Republic of Indonesia, faces many constraints in managing its employees. While private organizations are able to conduct human resource management according to best practice, DGSAM is limited by many regulations, especially regarding punishment and layoff policies for under-performing employees. Therefore, since 2015, DGSAM has tried to implement a new and uncommon approach that considers employees' preferences in order to encourage their motivation and performance. Employees may propose their preferred job placements, and DGSAM considers these proposals when deciding on employee deployment. This study tries to determine the impact of the preference-based approach on employee satisfaction and organizational performance. It uses a quantitative approach, regression analysis, to measure the impact of deployment on the satisfaction of deployed employees and on the performance change of the related units in DGSAM. The results show that the preference-based approach significantly improves employee satisfaction and the performance of the related units as well. Based on these results, it can be suggested that the approach can be implemented in the wider scope of the Ministry of Finance of the Republic of Indonesia and in public sector organizations throughout Indonesia. However, this study focuses only on short-term measurement, so further study is suggested to analyze the long-term impact.

Keywords: employee deployment, employee satisfaction, human resource management, organizational performance, preference-based approach

Procedia PDF Downloads 332
26749 Immobilization of Lipase Enzyme by Low Cost Material: A Statistical Approach

Authors: Md. Z. Alam, Devi R. Asih, Md. N. Salleh

Abstract:

Immobilization of lipase produced from palm oil mill effluent (POME) on activated carbon (AC), selected from among low-cost support materials, was optimized. The results indicated that an immobilization of 94% was achieved with AC as the most suitable support material. A sequential optimization strategy based on statistical experimental design, including the one-factor-at-a-time (OFAT) method, was used to determine the equilibrium time. Three components influencing lipase immobilization were optimized by response surface methodology (RSM) based on a face-centered central composite design (FCCCD). From the statistical analysis of the results, the optimum enzyme loading, agitation rate, and activated carbon dosage were found to be 30 U/ml, 300 rpm, and 8 g/L, respectively, with a maximum immobilization activity of 3732.9 U/g-AC after 2 hrs of immobilization. Analysis of variance (ANOVA) showed a high regression coefficient (R²) of 0.999, which indicated a satisfactory fit of the model to the experimental data. The parameters were statistically significant at p<0.05.
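
The RSM step can be illustrated schematically: fit a second-order polynomial response surface to (coded) design points of a three-factor face-centered central composite design. The responses below are fabricated; the paper's optimum (30 U/ml, 300 rpm, 8 g/L) came from its real FCCCD data, not from this sketch.

```python
import numpy as np

# Coded FCCCD for three factors: 8 factorial corners, 6 face centers, 1 center point
corners = np.array([[i, j, k] for i in (-1, 1) for j in (-1, 1) for k in (-1, 1)], float)
faces = np.vstack([np.eye(3), -np.eye(3)])
X = np.vstack([corners, faces, np.zeros((1, 3))])

# Fabricated responses with a peak near the center of the design space
y = 3000 - 400 * ((X - 0.2) ** 2).sum(axis=1)
y += np.random.default_rng(5).normal(0, 20, len(y))

def design_matrix(X):
    """Columns of the second-order model: 1, x_i, x_i^2, x_i*x_j."""
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(3)]
    cols += [X[:, i] ** 2 for i in range(3)]
    cols += [X[:, i] * X[:, j] for i in range(3) for j in range(i + 1, 3)]
    return np.column_stack(cols)

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
print(np.round(beta, 1))   # negative quadratic terms indicate an interior optimum
```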

Keywords: activated carbon, POME based lipase, immobilization, adsorption

Procedia PDF Downloads 243
26748 A Decision Support System to Detect the Lumbar Disc Disease on the Basis of Clinical MRI

Authors: Yavuz Unal, Kemal Polat, H. Erdinc Kocer

Abstract:

In this study, a decision support system comprising three stages is proposed to detect disc abnormalities of the lumbar region. In the first stage, feature extraction, T2-weighted sagittal and axial Magnetic Resonance Images (MRI) were taken from 55 people, and 27 appearance and shape features were extracted from both the sagittal and transverse images. In the second stage, feature weighting, the k-means clustering based feature weighting (KMCBFW) method proposed by Gunes et al. was used. Finally, in the third stage, classification, classifier algorithms including the multi-layer perceptron (MLP) neural network, support vector machine (SVM), Naïve Bayes, and decision tree were used to classify whether or not the subject has lumbar disc disease. To test the performance of the proposed method, the classification accuracy (%), sensitivity, specificity, precision, recall, f-measure, kappa value, and computation times were used. The best hybrid model for detecting lumbar disc disease from both sagittal and axial MR images is the combination of k-means clustering based feature weighting and the decision tree.
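
One plausible reading of k-means based feature weighting is sketched below; this is a generic interpretation, not necessarily Gunes et al.'s exact KMCBFW formulation. Each feature is weighted by the ratio of its overall mean to the mean of its per-feature k-means cluster centers, compressing within-class scatter before the classification stage.

```python
import numpy as np
from sklearn.cluster import KMeans

def kmeans_feature_weights(X, n_clusters=2):
    """One weight per feature: feature mean over the mean of its k-means centers."""
    weights = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        col = X[:, [j]]
        km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(col)
        weights[j] = col.mean() / km.cluster_centers_.mean()
    return weights

rng = np.random.default_rng(9)
X = rng.uniform(0.5, 3.0, size=(55, 27))    # 55 subjects x 27 MRI-derived features
X_weighted = X * kmeans_feature_weights(X)  # would then feed MLP/SVM/decision tree
print(X_weighted.shape)                     # (55, 27)
```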

Keywords: lumbar disc abnormality, lumbar MRI, lumbar spine, hybrid models, hybrid features, k-means clustering based feature weighting

Procedia PDF Downloads 520