Search results for: coverage measure
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3850

3580 Extended Intuitionistic Fuzzy VIKOR Method in Group Decision Making: The Case of Vendor Selection Decision

Authors: Nastaran Hajiheydari, Mohammad Soltani Delgosha

Abstract:

Vendor (supplier) selection is a group decision-making (GDM) process in which, based on predetermined criteria, experts' preferences are used to rank and choose the most desirable suppliers. In a real business environment, attitudes and choices are formed in uncertain, indecisive situations and cannot be expressed in a crisp framework. Intuitionistic fuzzy sets (IFSs) handle such situations well. The VIKOR method was developed to solve multi-criteria decision-making (MCDM) problems. This method, which determines a compromise feasible solution with respect to conflicting criteria, introduces a multi-criteria ranking index based on a particular measure of 'closeness' to the 'ideal solution'. Until now, there has been little investigation of VIKOR with IFSs; we therefore extend intuitionistic fuzzy (IF) VIKOR to solve the vendor selection problem in an IF GDM environment. The present study develops an IF VIKOR method for a GDM situation. A model is presented to calculate the criterion weights based on an entropy measure. The interval-valued intuitionistic fuzzy weighted geometric (IFWG) operator is then used to obtain the aggregate decision matrix. Next, an approach based on the positive ideal intuitionistic fuzzy number (PIIFN) and negative ideal intuitionistic fuzzy number (NIIFN) is developed. Finally, the application of the proposed method to a vendor selection problem is illustrated.
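The ranking index at the heart of VIKOR can be illustrated in the crisp (non-fuzzy) setting. The sketch below is a minimal plain-Python version, assuming benefit criteria and illustrative data, and is not the intuitionistic-fuzzy extension developed in the paper:

```python
def vikor_rank(matrix, weights, v=0.5):
    """Classical (crisp) VIKOR compromise index.

    matrix[i][j]: score of alternative i on benefit criterion j
    (assumes the alternatives are not all equal on every criterion).
    Returns Q values; lower Q = closer to the compromise solution.
    """
    m, n = len(matrix), len(matrix[0])
    f_best = [max(row[j] for row in matrix) for j in range(n)]
    f_worst = [min(row[j] for row in matrix) for j in range(n)]
    S, R = [], []  # group utility and individual regret
    for row in matrix:
        terms = [weights[j] * (f_best[j] - row[j]) / (f_best[j] - f_worst[j])
                 for j in range(n)]
        S.append(sum(terms))
        R.append(max(terms))
    s_min, s_max, r_min, r_max = min(S), max(S), min(R), max(R)
    return [v * (S[i] - s_min) / (s_max - s_min)
            + (1 - v) * (R[i] - r_min) / (r_max - r_min) for i in range(m)]
```

The IF extension replaces the crisp scores with intuitionistic fuzzy numbers and the max/min ideals with the PIIFN and NIIFN, but the S/R/Q ranking logic is the same.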

Keywords: group decision making, intuitionistic fuzzy set, intuitionistic fuzzy entropy measure, vendor selection, VIKOR

Procedia PDF Downloads 149
3579 A Three Elements Vector Valued Structure’s Ultimate Strength-Strong Motion-Intensity Measure

Authors: A. Nicknam, N. Eftekhari, A. Mazarei, M. Ganjvar

Abstract:

This article presents an alternative collapse-capacity intensity measure in three-element form, which is influenced by the spectral ordinates at periods longer than the first-mode period at near- and far-source sites. A parameter, denoted by β, is defined through which the effects of spectral ordinates up to the effective period (2T_1) are taken into account in the intensity measure. The methodology makes it possible to meet the hazard-levelled target extreme event in both probabilistic and deterministic forms. A MATLAB code involving OpenSees was developed to calculate the collapse capacities of 8 archetype RC structures of 2 to 20 stories for the regression process. The incremental dynamic analysis (IDA) method is used to calculate the structures' collapse values, accounting for element stiffness and strength deterioration. The general near-field record set presented by FEMA is used in a series of nonlinear analyses. Eight linear relationships are developed for the 8 structures, with correlation coefficients up to 0.93. A near-field collapse-capacity prediction equation is then developed from the results of the regression processes for the 8 structures. The proposed prediction equation is validated against a set of actual near-field records, showing good agreement. Applying the proposed equation to the four archetype RC structures yielded collapse capacities at near-field sites different from those of FEMA; the differences are believed to be due to accounting for spectral shape effects.

Keywords: collapse capacity, fragility analysis, spectral shape effects, IDA method

Procedia PDF Downloads 229
3578 FRATSAN: A New Software for Fractal Analysis of Signals

Authors: Hamidreza Namazi

Abstract:

Fractal analysis assesses the fractal characteristics of data. It comprises several methods for assigning fractal characteristics to a dataset, which may be a theoretical dataset or a pattern or signal extracted from phenomena including natural geometric objects, sound, market fluctuations, heart rates, digital images, molecular motion, and networks. Fractal analysis is now widely used in all areas of science. An important limitation is that an empirically determined fractal dimension does not by itself prove that a pattern is fractal; other essential characteristics have to be considered. For this purpose, a Visual C++ based software package called FRATSAN (FRActal Time Series ANalyser) was developed, which extracts information from signals through three measures: the fractal dimension, Jeffrey's measure, and the Hurst exponent. After computing these measures, the software plots a graph for each. Besides computing the three measures, the software can classify whether or not a signal is fractal. The software uses a dynamic method of analysis for all measures: a sliding window with a length equal to 10% of the total number of data entries is moved one data entry at a time, and all measures are recomputed at each step. This makes the computation very sensitive to slight changes in the data, giving the user a fine-grained analysis. To test the performance of the software, a set of EEG signals was given as input and the results were computed and plotted. The software is useful not only for fundamental fractal analysis of signals but also for other purposes; for instance, by analyzing the Hurst-exponent plot of an EEG signal in patients with epilepsy, the onset of seizure can be predicted from sudden changes in the plot.
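One of the three measures, the Hurst exponent, is commonly estimated by rescaled-range (R/S) analysis: fit log(R/S) against log(window size) over several scales. A minimal plain-Python sketch (assuming the chosen scales fit inside the series; FRATSAN's 10% sliding window would simply call such an estimator repeatedly on successive windows) is:

```python
import math

def rs_hurst(series, scales=(8, 16, 32, 64)):
    """Estimate the Hurst exponent via rescaled-range (R/S) analysis."""
    xs, ys = [], []
    for n in scales:
        rs_vals = []
        for start in range(0, len(series) - n + 1, n):
            win = series[start:start + n]
            mean = sum(win) / n
            dev, cum, lo, hi = 0.0, 0.0, 0.0, 0.0
            for v in win:
                cum += v - mean          # cumulative mean-adjusted deviation
                lo, hi = min(lo, cum), max(hi, cum)
                dev += (v - mean) ** 2
            s = math.sqrt(dev / n)       # std dev of the window
            if s > 0:
                rs_vals.append((hi - lo) / s)   # rescaled range R/S
        if rs_vals:
            xs.append(math.log(n))
            ys.append(math.log(sum(rs_vals) / len(rs_vals)))
    # least-squares slope of log(R/S) vs log(n) is the Hurst estimate
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

H near 0.5 indicates uncorrelated noise, H near 1 a persistent (trending) signal, and H near 0 an anti-persistent one, which is what makes sudden changes in the Hurst plot a useful seizure indicator.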

Keywords: EEG signals, fractal analysis, fractal dimension, Hurst exponent, Jeffrey’s measure

Procedia PDF Downloads 462
3577 Globalisation, Growth and Sustainability in Sub-Saharan Africa

Authors: Ourvashi Bissoon

Abstract:

Sub-Saharan Africa, in addition to being resource-rich, is increasingly seen as having huge growth potential and, as a result, is attracting MNEs to its soil. To empirically assess the effectiveness of GDP in tracking sustainable resource use and the role played by MNEs in Sub-Saharan Africa, a panel data analysis was undertaken for 32 countries over thirty-five years. The time horizon spans 1980-2014, from before the publication of the pioneering Brundtland report on sustainable development to date. Multinationals' presence is proxied by the level of FDI stocks. The empirical investigation first focuses on the impact of trade openness and MNE presence on the traditional measure of economic growth, namely the GDP growth rate; then on the genuine savings (GS) rate, a measure of weak sustainability developed by the World Bank that assumes substitutability between different forms of capital; and finally on the adjusted net national income (aNNI), a measure of green growth that caters for the depletion of natural resources. For countries with significant exhaustible natural resources and an important foreign investor presence, aNNI can be a better indicator of economic performance than GDP growth (World Bank, 2010). Potential endogeneity and reverse causality are also addressed, in addition to robustness tests. The findings indicate that FDI and openness contribute significantly and positively to the GDP growth of the countries in the sample; however, there is a threshold level of institutional quality below which FDI has a negative impact on growth. When the GS rate is substituted for the GDP growth rate, a natural resource curse becomes evident: the rents generated from the exploitation of natural resources are not being re-invested into other forms of capital, namely human and physical capital. FDI and trade patterns may be setting the economies in the sample on an unsustainable path of resource depletion. The resource curse is confirmed when utilising the aNNI as well, implying that the GDP growth measure may not be reliable for capturing sustainable development.
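As a rough illustration of how the World Bank's genuine (adjusted net) savings rate is assembled, here is a simplified sketch with hypothetical figures; the published methodology itemizes energy, mineral, and net forest depletion separately and adds pollution damages, so the single `resource_depletion` term below is a simplification:

```python
def genuine_savings_rate(gni, gross_savings, fixed_capital_consumption,
                         education_spend, resource_depletion, pollution_damage):
    """Simplified genuine (adjusted net) savings as a share of GNI:
    net savings plus education spending, minus natural-resource
    depletion and pollution damage. All arguments in the same currency."""
    gs = (gross_savings - fixed_capital_consumption + education_spend
          - resource_depletion - pollution_damage)
    return 100.0 * gs / gni
```

A country with high gross savings can still show a negative genuine savings rate once resource depletion is netted out, which is exactly the resource-curse pattern the paper reports.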

Keywords: FDI, sustainable development, genuine savings, sub-Saharan Africa

Procedia PDF Downloads 210
3576 Evaluation of a Piecewise Linear Mixed-Effects Model in the Analysis of Randomized Cross-over Trial

Authors: Moses Mwangi, Geert Verbeke, Geert Molenberghs

Abstract:

Cross-over designs are commonly used in randomized clinical trials to estimate the efficacy of a new treatment relative to a reference treatment (placebo or standard). The main advantage of a cross-over design over a conventional parallel design is its flexibility: every subject becomes its own control, thereby reducing confounding. Jones & Kenward discuss more recent developments in the analysis of cross-over trials in detail. We revisit the simple piecewise linear mixed-effects model proposed by Mwangi et al. (in press) for its first application to the analysis of cross-over trials. We compared the performance of the proposed piecewise linear mixed-effects model with two commonly cited statistical models used to estimate the treatment effect in randomized cross-over trials: (1) the Grizzle model and (2) the Jones & Kenward model. We estimated two performance measures (mean square error (MSE) and coverage probability) for the three methods, using data simulated from the proposed piecewise linear mixed-effects model. The piecewise linear mixed-effects model yielded the lowest MSE estimates compared to the Grizzle and Jones & Kenward models for both small (Nobs=20) and large (Nobs=600) sample sizes, and its coverage probabilities were the highest for both sample sizes. The piecewise linear mixed-effects model is therefore a better estimator of the treatment effect than its two competitors in the analysis of cross-over trials. The data-generating mechanism used in this paper captures two time periods for a simple 2-Treatments x 2-Periods cross-over design; it is extendible to more complex cross-over designs with multiple treatments and periods. Note, however, that even for single-response models, adding more random effects increases model complexity and may make the model difficult or impossible to fit in some cases.
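The two performance measures can be illustrated generically. The sketch below simulates a simple two-arm comparison (a deliberately reduced stand-in, not the authors' piecewise mixed model or a cross-over layout) and reports the MSE and 95% confidence-interval coverage of the effect estimate across simulated trials:

```python
import math
import random

def mse_and_coverage(true_effect=2.0, n_sims=500, n_obs=40, sd=1.0, seed=1):
    """Simulate n_sims two-arm trials; in each, estimate the treatment
    effect as the difference in group means, then summarise MSE and
    95% CI coverage of the estimator across trials."""
    rng = random.Random(seed)
    sq_err, covered = 0.0, 0
    for _ in range(n_sims):
        control = [rng.gauss(0.0, sd) for _ in range(n_obs)]
        treated = [rng.gauss(true_effect, sd) for _ in range(n_obs)]
        est = sum(treated) / n_obs - sum(control) / n_obs
        se = math.sqrt(2 * sd * sd / n_obs)  # known-variance standard error
        sq_err += (est - true_effect) ** 2
        if abs(est - true_effect) <= 1.96 * se:
            covered += 1
    return sq_err / n_sims, covered / n_sims
```

Comparing estimators then amounts to running the same simulated datasets through each model and preferring the one with lower MSE and coverage closer to the nominal 95%, which is the comparison the paper performs.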

Keywords: evaluation, Grizzle model, Jones & Kenward model, performance measures, simulation

Procedia PDF Downloads 117
3575 A Study on an Evacuation Test to Measure Delay Time in Using an Evacuation Elevator

Authors: Kyungsuk Cho, Seungun Chae, Jihun Choi

Abstract:

Elevators are being examined as an evacuation method for super-tall buildings; however, data on the use of elevators for evacuation during a fire are extremely scarce. A test was therefore conducted to measure the delay time in using an evacuation elevator. The time taken to board and alight was measured, and cases in which people gave up boarding because the elevator's capacity was exceeded were also considered. 170 men and women participated: 130 young people (20-50 years old) and 40 senior citizens (over 60 years old). The elevator had a capacity of 25 people and travelled between the 2nd and 4th floors. A video recording device was used to analyze the test. An elevator in an ordinary building, not a super-tall building, was used to measure the boarding and alighting delay times. To minimize interference from other elements, the elevator platforms on the 2nd and 4th floors were partitioned off. Trials in which fewer than 20 people boarded an empty elevator were excluded, as were trials in which the elevator stopped carrying 10 passengers and fewer than 10 new passengers boarded. Boarding an empty elevator was observed 49 times: the average number of passengers was 23.7, boarding took 14.98 seconds on average, and the load factor was 1.67 N/s. Alighting by the same average of 23.7 passengers took 10.84 seconds, giving an unload factor of 2.33 N/s. When an elevator's capacity is exceeded, the excess passengers must get off; the time this takes and the probability of each case were measured. Capacity was exceeded in 37% of the boardings, and as the number of people who gave up boarding increased, the load factor of the ride decreased. When 1 person gave up boarding, the load factor was 1.55 N/s; this was observed 10 times (12.7% of the total). When 2 people gave up boarding, the load factor was 1.15 N/s, observed 7 times (8.9%). When 3 people gave up boarding, the load factor was 1.26 N/s, observed 4 times (5.1%). When 4 people gave up boarding, the load factor was 1.03 N/s, observed 5 times (6.3%). The test yielded boarding and alighting time data for people who can walk freely, as well as quantitative results on the relation between the number of people giving up boarding and the boarding time. This work was supported by the National Research Council of Science & Technology (NST) grant by the Korea government (MSIP) (No. CRC-16-02-KICT).
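The load and unload factors above are passenger flow rates in persons per second; a trivial sketch of the computation on the reported averages:

```python
def flow_rate(n_passengers, seconds):
    """Passenger flow rate in persons per second (N/s)."""
    if seconds <= 0:
        raise ValueError("duration must be positive")
    return n_passengers / seconds

# Reported averages: 23.7 passengers, 14.98 s to board, 10.84 s to alight.
board = flow_rate(23.7, 14.98)   # boarding rate, roughly 1.6 N/s
alight = flow_rate(23.7, 10.84)  # alighting rate, roughly 2.2 N/s
```

Note that dividing the mean counts by the mean times gives about 1.58 and 2.19 N/s, slightly below the published 1.67 and 2.33 N/s; a plausible explanation is that the paper averages the per-trial rates rather than dividing the averages, but this is an inference, not stated in the abstract.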

Keywords: evacuation elevator, super tall buildings, evacuees, delay time

Procedia PDF Downloads 172
3574 Resume Ranking Using Custom Word2vec and Rule-Based Natural Language Processing Techniques

Authors: Subodh Chandra Shakya, Rajendra Sapkota, Aakash Tamang, Shushant Pudasaini, Sujan Adhikari, Sajjan Adhikari

Abstract:

Considerable effort has been devoted to measuring the semantic similarity between text corpora, and techniques have evolved to measure the similarity of two documents. One state-of-the-art technique in Natural Language Processing (NLP) is the word-to-vector (word2vec) family of models, which converts words into word embeddings and measures the similarity between the resulting vectors. We found this approach useful for the task of resume ranking. This paper therefore implements a word2vec model, along with other NLP techniques, to rank resumes against a particular job description and thereby automate part of the hiring process. The paper describes the proposed system and the findings made while building it.
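A common way to score a resume against a job description with word2vec is to average the word vectors of each text and compare them by cosine similarity. The sketch below is self-contained and uses toy two-dimensional embeddings; a real system would train word2vec (e.g. with gensim) on a domain corpus, and the paper additionally applies rule-based NLP such as chunking, which is not shown here:

```python
import math

def avg_vector(tokens, embeddings):
    """Mean of the word vectors for tokens present in the embedding table."""
    vecs = [embeddings[t] for t in tokens if t in embeddings]
    if not vecs:
        return None
    dim = len(vecs[0])
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def rank_resumes(job_tokens, resumes, embeddings):
    """Return resume ids sorted by cosine similarity to the job description."""
    jd = avg_vector(job_tokens, embeddings)
    scored = []
    for rid, tokens in resumes.items():
        rv = avg_vector(tokens, embeddings)
        if jd and rv:
            scored.append((cosine(jd, rv), rid))
    return [rid for _, rid in sorted(scored, reverse=True)]
```

The embedding table, token lists, and resume ids here are illustrative placeholders, not the paper's data.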

Keywords: chunking, document similarity, information extraction, natural language processing, word2vec, word embedding

Procedia PDF Downloads 150
3573 Relation between Chronic Mechanical Low Back Pain and Hip Rotation

Authors: Mohamed M. Diab, Koura G. Mohamed, A. Balbaa, Radwan Sh. Ahamed

Abstract:

Background: Chronic mechanical low back pain (CMLBP) is the most common complaint of the working-age population. Mechanical low back pain is often a chronic, dull, aching pain of varying intensity that affects the lower spine. The hip rotation-CMLBP relationship investigated here rests on the premise that limited hip motion is compensated by motion in the lumbopelvic region, and the resulting increased force is transmitted to the lumbar spine. The purpose of this study was to investigate whether there is a relationship between CMLBP and hip medial and lateral rotation (peak torque and range of motion (ROM)) in patients with CMLBP. Methods: Sixty patients with CMLBP diagnosed by an orthopedist participated in the study after signing a consent form. Their mean age was 23.76±2.39 years, mean weight 71.8±12.7 kg, mean height 169.65±7.49 cm, and mean BMI 25.5±3.86 kg/m2. A Visual Analogue Scale (VAS) was used to assess pain. A fluid-filled inclinometer was used to measure hip rotation ROM (medial and lateral). An isokinetic dynamometer was used to measure the concentric peak torque of the hip rotator muscles (medial and lateral) at two isokinetic speeds (60°/sec and 180°/sec). Results: There was a poor relationship between pain and hip external rotation ROM, and likewise between pain and hip internal rotation ROM. There was also a poor relationship between pain and the peak torque of the hip internal and external rotators at both speeds. Conclusion: Based on this study, it is not recommended to attach importance to hip rotation in treating chronic mechanical low back pain.

Keywords: hip rotation ROM, hip rotators strength, low back pain, chronic mechanical

Procedia PDF Downloads 303
3572 A Context-Sensitive Algorithm for Media Similarity Search

Authors: Guang-Ho Cha

Abstract:

This paper presents a context-sensitive media similarity search algorithm. One of the central problems in media search is the semantic gap between the low-level features computed automatically from media data and the human interpretation of them: the notion of similarity is usually based on high-level abstraction, but the low-level features do not always reflect human perception. Many media search algorithms have used the Minkowski metric to measure similarity between image pairs. However, such functions cannot adequately capture the characteristics of the human visual system, nor the nonlinear relationships in the contextual information given by the images in a collection. Our search algorithm tackles this problem by employing a similarity measure and a ranking strategy that reflect the nonlinearity of human perception and the contextual information in a dataset. Similarity search in an image database based on this contextual information shows encouraging experimental results.

Keywords: context-sensitive search, image search, similarity ranking, similarity search

Procedia PDF Downloads 359
3571 Where do Pregnant Women Miss Out on Nutrition? Analysis of Survey Data from 22 Countries

Authors: Alexis D'Agostino, Celeste Sununtunasuk, Jack Fiedler

Abstract:

Background: Iron-folic acid (IFA) supplementation during antenatal care (ANC) has existed in many countries for decades. Despite this, low national coverage persists and women do not often consume appropriate amounts during pregnancy. USAID’s SPRING Project investigated pregnant women’s access to, and consumption of, IFA tablets through ANC. Cross-country analysis provided a global picture of the state of IFA-supplementation, while country-specific results noted key contextual issues, including geography, wealth, and ANC attendance. The analysis can help countries prioritize strategies for systematic performance improvements within one of the most common micronutrient supplementation programs aimed at reducing maternal anemia. Methodology: Using falter point analysis on Demographic and Health Survey (DHS) data collected from 162,958 women across 22 countries, SPRING identified four sequential falter points (ANC attendance, IFA receipt or purchase, IFA consumption, and number of tablets taken) where pregnant women fell out of the IFA distribution structure. SPRING analyzed data on IFA intake from DHS surveys with women of reproductive age. SPRING disaggregated these data by ANC participation during the most recent pregnancy, residency, and women’s socio-economic status. Results: Average sufficient IFA tablet use across all countries was only eight percent. Even in the best performing countries, only about one-third of pregnant women consumed 180 or more IFA tablets during their most recent pregnancy. ANC attendance was an important falter point for a quarter of women across all countries (with highest falter rates in Democratic Republic of the Congo, Nigeria, and Niger). 
Further analysis reveals patterns, with some countries having high ANC coverage but low IFA provision during ANC (DRC and Haiti), others having high ANC coverage and IFA provision but few women taking any tablets (Nigeria and Liberia), and countries that perform well in ANC, supplies, and initial consumption but where very few women consume the recommended 180 tablets (Malawi and Cambodia). Country-level analysis identifies further patterns of supplementation. In Indonesia, for example, only 62% of women in the poorest quintile took even one IFA tablet, while 86% of the wealthiest women did. This association between socioeconomic status and IFA intake held across nearly all countries where these data are available and was also visible in rural/urban comparisons. Analysis of ANC attendance data also suggests that higher numbers of ANC visits are associated with higher tablet intake. Conclusions: While it is difficult to disentangle which specific aspects of supply or demand cause the low rates of consumption, this tool allows policy-makers to identify major bottlenecks to scaling-up IFA supplementation during ANC. In turn, each falter point provides possible explanations of program performance and helps strategically identify areas for improved IFA supplementation. For example, improving the delivery of IFA supplementation in Ethiopia relies on increasing access to ANC, but also on identifying and addressing program gaps in IFA supply management and health workers’ practices in order to provide quality ANC services. While every country requires a customized approach to improving IFA supplementation, the multi-country analysis conducted by SPRING is a helpful first step in identifying country bottlenecks and prioritizing interventions.
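The four sequential falter points form a simple funnel over the survey records. A sketch of the computation on hypothetical records follows; the field names are illustrative, not actual DHS variable names:

```python
def falter_points(women):
    """Count how many women remain at each sequential step of the IFA
    cascade: attended ANC -> received/purchased IFA -> took any tablet
    -> took the recommended 180+ tablets."""
    steps = [
        ("attended_anc", lambda w: w["anc_visits"] > 0),
        ("received_ifa", lambda w: w["got_ifa"]),
        ("took_any",     lambda w: w["tablets"] > 0),
        ("took_180",     lambda w: w["tablets"] >= 180),
    ]
    remaining = list(women)
    funnel = {}
    for name, keep in steps:
        remaining = [w for w in remaining if keep(w)]
        funnel[name] = len(remaining)
    return funnel
```

Comparing the drop between adjacent counts shows where a given country falters, which is how the cross-country patterns above (high ANC but low IFA provision, high provision but low consumption, and so on) can be read off the data.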

Keywords: iron and folic acid, supplementation, antenatal care, micronutrient

Procedia PDF Downloads 390
3570 Empirical Exploration of Correlations between Software Design Measures: A Replication Study

Authors: Jehad Al Dallal

Abstract:

Software engineers apply different measures to quantify the quality of software design. These measures consider artifacts developed in low- or high-level software design phases, and the results are used to point to design weaknesses and to indicate design points that have to be restructured. Understanding the relationships among the quality measures, and among the design quality aspects they consider, is important for interpreting the impact that a measure of one quality aspect has on other, potentially related aspects. In addition, exploring the relationships between quality measures helps to explain the impact of different quality measures on external quality aspects, such as reliability and maintainability. In this paper, we report a replication study that empirically explores the correlations between six well-known and commonly applied design quality measures. These measures consider several quality aspects, including complexity, cohesion, coupling, and inheritance. The results indicate that the inheritance measures are weakly correlated with the other measures, whereas the complexity, coupling, and cohesion measures are mostly strongly correlated.
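The correlation statistic named in the keywords, Spearman's rho, is the Pearson correlation of the rank vectors, which makes it suitable for the monotone but nonlinear relationships expected between design measures. A minimal plain-Python sketch (average ranks for ties; `scipy.stats.spearmanr` offers the same computation off the shelf):

```python
def _ranks(xs):
    """Average ranks (1-based), with tied values sharing the mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(xs, ys):
    """Spearman's rho: Pearson correlation of the two rank vectors."""
    rx, ry = _ranks(xs), _ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx) ** 0.5
    vy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (vx * vy)
```

Any monotone relationship between two measures, such as coupling rising with complexity, yields rho near 1 even when the raw values are far from linear.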

Keywords: quality attribute, quality measure, software design quality, Spearman correlation

Procedia PDF Downloads 292
3569 Portable, Noninvasive and Wireless Near Infrared Spectroscopy Device to Monitor Skeletal Muscle Metabolism during Exercise

Authors: Adkham Paiziev, Fikrat Kerimov

Abstract:

Near-infrared spectroscopy (NIRS) is a biophotonic technique that can be used to monitor oxygenation and hemodynamics in a variety of human tissues, including skeletal muscle. In the present work, we use tissue oximetry (OxyPrem) to measure hemodynamic parameters of skeletal muscles at rest and during exercise. Purpose: To elaborate a new wireless, portable, noninvasive, wearable NIRS device to measure skeletal muscle oxygenation during exercise, and to test this device on the brachioradialis muscle of wrestler volunteers using a combined method of arterial occlusion (AO) and NIRS (AO+NIRS). Methods: The OxyPrem NIRS device was used together with the AO test. AO and isometric brachioradialis muscle contraction experiments were performed on one group of wrestler volunteers. An 'Accu-Measure' caliper (USA) was used to measure skinfold thickness (SFT). Results: The elaborated device consists of a power supply box, a sensor head, and installed 'Tubis' software for data acquisition and for computing deoxyhemoglobin ([HHb]), oxyhemoglobin ([O2Hb]), tissue oxygenation (StO2), and muscle tissue oxygen consumption (mVO2). The sensor head comprises four light sources, each with three light-emitting diodes at nominal wavelengths of 760 nm, 805 nm, and 870 nm, and two detectors. AO and isometric voluntary forearm muscle contraction (IVFMC) were measured in five healthy male subjects (age 23.2±0.84 years; SFT 0.43±0.05 cm) and four female subjects (age 22.0±1.0 years; SFT 0.24±0.04 cm). mVO2 for the control group was -0.65±0.07 %/sec for male and -0.69±0.19 %/sec for female subjects. The tissue oxygenation index for the wrestlers averaged about 75%, whereas for the control group StO2 = 63%. The second experiment monitored muscle activity during IVFMC at 10%, 30%, and 50% of MVC; the concentration changes of HbO2 and HHb correlated positively with contraction intensity. Conclusion: We have presented a portable, multi-channel, wireless NIRS device for real-time monitoring of muscle activity. The miniaturized NIRS sensor and the use of wireless communication make the whole device compact, so it can be used for muscle monitoring.

Keywords: skeletal muscle, oxygenation, instrumentation, near infrared spectroscopy

Procedia PDF Downloads 272
3568 Q-Test of Undergraduate Epistemology and Scientific Thought: Development and Testing of an Assessment of Scientific Epistemology

Authors: Matthew J. Zagumny

Abstract:

The QUEST is an assessment of scientific epistemic beliefs, developed to measure students' intellectual development with regard to beliefs about knowledge and knowing. The QUEST utilizes Q-sort methodology, which requires participants to rate the degree to which statements describe them personally. As a measure of personal theories of knowledge, the QUEST instrument is described, and the Q-sort distribution and scoring are explained. A preliminary demonstration of the QUEST assessment is described in which two samples of undergraduate students (novice/lower-division versus advanced/upper-division) were assessed and their average QUEST scores compared. The usefulness of an assessment of epistemology is discussed in terms of the principle that assessment tends to drive educational practice and university mission. The critical need for universities and academic programs to focus on developing students' scientific epistemology is briefly discussed.

Keywords: scientific epistemology, critical thinking, Q-sort method, STEM undergraduates

Procedia PDF Downloads 370
3567 Bi-Axial Stress Effects on Barkhausen-Noise

Authors: G. Balogh, I. A. Szabó, P.Z. Kovács

Abstract:

Mechanical stress has a strong effect on the magnitude of Barkhausen noise in structural steels. Because the measurements are performed at the surface of the material, the full effect on a sample sheet can be described by a biaxial stress field. The measured Barkhausen noise depends on the orientation of the exciting magnetic field relative to the axes of the stress tensor. Sample inhomogeneities, including residual stress, also modify the angular dependence of the measured Barkhausen noise. We have developed a laboratory device with a cross-shaped specimen for bi-axial bending. The measuring head performs excitations in two orthogonal directions, which can be driven independently or simultaneously with different amplitudes. Simultaneous excitation of the two coils can be performed in phase or with a 90-degree phase shift. In principle, assuming linear superposition of the two fields, this makes it possible to measure the Barkhausen noise at an arbitrary direction without moving the head, or to measure the Barkhausen noise induced by a rotating magnetic field.

Keywords: Barkhausen-noise, bi-axial stress, stress measuring, stress dependency

Procedia PDF Downloads 290
3566 Bienzymatic Nanocomposites Biosensors Complexed with Gold Nanoparticles, Polyaniline, Recombinant MN Peroxidase from Corn, and Glucose Oxidase to Measure Glucose

Authors: Anahita Izadyar

Abstract:

Using a recombinant enzyme derived from corn and a simple modification, we fabricate a facile, fast, and cost-effective novel biosensor to measure glucose. We apply plant-produced Mn peroxidase (PPMP), glucose oxidase (GOx), polyaniline (PANI) as a conductive polymer, and gold nanoparticles (AuNPs) on a Au electrode, using the electrochemical response to detect glucose. We used the entrapment method of enzyme immobilization, which is generally employed with conductive polymers to facilitate electron transfer from the enzyme's oxidation-reduction center to the sample solution. In this work, the oxidation of glucose on the modified gold electrode was quantified with linear sweep voltammetry (LSV). We expect the modified biosensor to have potential for monitoring various biofluids.

Keywords: plant-produced manganese peroxidase, enzyme-based biosensors, glucose, modified gold nanoparticles electrode, polyaniline

Procedia PDF Downloads 189
3565 Searching the Efficient Frontier for the Coherent Covering Location Problem

Authors: Felipe Azocar Simonet, Luis Acosta Espejo

Abstract:

In this article, we seek an efficient-frontier approximation for the bi-objective location problem with coherent coverage for two levels of hierarchy (CCLP). We present the mathematical formulation of the model used. Supported and unsupported efficient solutions are obtained by solving the bi-objective combinatorial problem through the weights method using a Lagrangean heuristic. Subsequently, the results are validated through DEA analysis with the GEM index (global efficiency measurement).
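The weights method mentioned above scalarizes the two objectives into a single weighted sum and sweeps the weight. A toy sketch on candidate solutions given as (f1, f2) cost pairs (minimization assumed; in the actual CCLP each weighted subproblem is solved by the Lagrangean heuristic rather than enumeration, and unsupported efficient solutions require a further step):

```python
def weighted_sum_frontier(solutions, steps=11):
    """Sweep the weight on objective 1 from 0 to 1 and keep every
    minimiser of w*f1 + (1-w)*f2. This recovers the *supported*
    efficient solutions; unsupported ones lie inside the convex hull
    and are never optimal for any weight."""
    frontier = set()
    for k in range(steps):
        w = k / (steps - 1)
        best = min(solutions, key=lambda s: w * s[0] + (1 - w) * s[1])
        frontier.add(best)
    return sorted(frontier)
```

Dominated points such as one that is worse on both objectives never minimise any weighted sum, so they are filtered out automatically.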

Keywords: coherent covering location problem, efficient frontier, Lagrangian relaxation, data envelopment analysis

Procedia PDF Downloads 324
3564 Lean Impact Analysis Assessment Models: Development of a Lean Measurement Structural Model

Authors: Catherine Maware, Olufemi Adetunji

Abstract:

This paper aims to develop a model to measure the impact of Lean manufacturing deployment on organizational performance. The model will help industry practitioners assess the impact of implementing Lean constructs on organizational performance, and will harmonize measurement models of Lean performance with the 'house of Lean', which seems to have become the industry standard. The sheer number of measurement models for assessing the impact of Lean implementation makes it difficult for new adopters to select an appropriate assessment model or deployment methodology. A literature review is conducted to classify Lean performance models, and Pareto analysis is used to select the Lean constructs for the model's development. The model is then formalized through Structural Equation Modeling (SEM), defining the underlying latent structure of a Lean system. The resulting impact assessment measurement model can be used to measure Lean performance and can be adopted by different industries.

Keywords: impact measurement model, lean bundles, lean manufacturing, organizational performance

Procedia PDF Downloads 477
3563 Dynamic Ambulance Deployment to Reduce Ambulance Response Times Using Geographic Information Systems

Authors: Masoud Swalehe, Semra Günay

Abstract:

Developed countries are losing many lives to non-communicable diseases as compared to their developing counterparts. The effects of these diseases are mostly sudden and manifest only a short time before death or a dangerous attack, which has consolidated the significance of the emergency medical system (EMS) as one of the vital areas of healthcare service delivery. The primary objective of this research is to reduce the ambulance response times (RT) of the Eskişehir province EMS, since a number of studies have established a relationship between ambulance response times and the survival chances of patients, especially out-of-hospital cardiac arrest (OHCA) victims. Studies have found that patients who receive out-of-hospital medical attention within a few (about four) minutes of cardiac arrest stand higher chances of survival than those who wait longer (more than 12 minutes) for care because of higher ambulance response times. The study will make use of geographic information systems (GIS) technology to dynamically reallocate ambulance resources according to demand and time so as to reduce ambulance response times. The geospatial-time distribution of ambulance calls (demand) will serve as the basis for optimal ambulance deployment under a system status management (SSM) strategy, maximizing demand coverage with the same number of ambulances and thereby reducing response times. Drive-time polygons, generated with network analysis techniques, will be used to delineate time-specific facility coverage areas and to suggest additional candidate facility sites to which ambulance resources can be moved to serve higher demand. Emergency ambulance call data from 1 January 2014 to 31 December 2014, obtained from the Eskişehir province health directorate, will be used in this study. This study will focus on the reduction of ambulance response times, a key emergency medical services performance indicator.
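The drive-time coverage areas described above can be illustrated with a minimal sketch: a Dijkstra search over a road network that keeps every node reachable within a time budget from an ambulance station. The toy network, node names, and the 8-minute budget are illustrative assumptions, not data from the study, which uses GIS network analysis tools rather than hand-rolled code.

```python
import heapq

def service_area(graph, source, budget_min):
    """Nodes reachable from `source` within `budget_min` minutes of drive time.

    `graph` maps each node to a list of (neighbor, minutes) edges.
    Returns {node: shortest drive time} for all nodes within the budget.
    """
    reached = {}
    queue = [(0.0, source)]
    while queue:
        t, node = heapq.heappop(queue)
        if node in reached or t > budget_min:
            continue  # already settled, or beyond the drive-time budget
        reached[node] = t
        for nbr, minutes in graph.get(node, []):
            if nbr not in reached:
                heapq.heappush(queue, (t + minutes, nbr))
    return reached

# Hypothetical road network: station "A" and demand points "B".."E"
roads = {
    "A": [("B", 2.0), ("C", 5.0)],
    "B": [("D", 3.0)],
    "C": [("D", 1.0)],
    "D": [("E", 9.0)],
}
covered = service_area(roads, "A", budget_min=8.0)  # E lies outside the budget
```

Repeating this per candidate station site and comparing the demand weight inside each coverage set is one simple way to rank relocation options.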

Keywords: emergency medical services, system status management, ambulance response times, geographic information system, geospatial-time distribution, out of hospital cardiac arrest

Procedia PDF Downloads 294
3562 Using the Smith-Waterman Algorithm to Extract Features in the Classification of Obesity Status

Authors: Rosa Figueroa, Christopher Flores

Abstract:

Text categorization is the problem of assigning a new document to a set of predetermined categories, on the basis of a training set of free-text data containing documents whose category membership is known. To train a classification model, it is necessary to extract characteristics in the form of tokens that facilitate the learning and classification process. In text categorization, the feature extraction process involves the use of word sequences, also known as N-grams. In general, documents belonging to the same category are expected to share similar features. The Smith-Waterman (SW) algorithm is a dynamic programming algorithm that performs local sequence alignment to determine similar regions between two strings or protein sequences. This work explores the use of the SW algorithm as an alternative approach to feature extraction in text categorization. The dataset used for this purpose contains 2,610 annotated documents with the classes Obese/Non-Obese. This dataset was represented in matrix form using the Bag of Words approach, with term frequency-inverse document frequency (TF-IDF) as the score representing the occurrence of tokens in each document. To extract features for classification, four experiments were conducted: the first used SW to extract features, the second used unigrams (single words), the third used bigrams (two-word sequences), and the last used a combination of unigrams and bigrams. To test the effectiveness of the extracted feature sets, a Support Vector Machine (SVM) classifier was tuned using 20% of the dataset. The remaining 80% of the dataset, together with 5-fold cross-validation, was used to evaluate and compare the performance of the four feature extraction experiments. Results from the tuning process suggest that SW performs better than the N-gram based feature extraction. These results were confirmed on the remaining 80% of the dataset, where SW performed best (accuracy = 97.10%, weighted average F-measure = 97.07%). The second best result was obtained by the combination of unigrams and bigrams (accuracy = 96.04%, weighted average F-measure = 95.97%), closely followed by bigrams (accuracy = 94.56%, weighted average F-measure = 94.46%) and finally unigrams (accuracy = 92.96%, weighted average F-measure = 92.90%).
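The scoring core of the Smith-Waterman algorithm mentioned above can be sketched in a few lines of dynamic programming. The match/mismatch/gap scores below are illustrative defaults, not the parameters used in the study.

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Local alignment score between strings a and b (Smith-Waterman).

    Fills an (len(a)+1) x (len(b)+1) score matrix H; cells are clamped
    at zero so alignments can start anywhere, and the best local
    alignment score is the matrix maximum.
    """
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

# The shared local region "obes" (4 matches x 2) dominates the score
score = smith_waterman("obese patient", "obesity")
```

For feature extraction, such pairwise scores between a document and category exemplars can serve as similarity features in place of raw N-gram counts.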

Keywords: comorbidities, machine learning, obesity, Smith-Waterman algorithm

Procedia PDF Downloads 290
3561 Flood Hazard Assessment and Land Cover Dynamics of the Orai Khola Watershed, Bardiya, Nepal

Authors: Loonibha Manandhar, Rajendra Bhandari, Kumud Raj Kafle

Abstract:

Nepal’s Terai region is part of the Ganges river basin, one of the most disaster-prone areas of the world, with recurrent monsoon flooding causing millions in damage and the death and displacement of hundreds of people and households every year. The vulnerability of human settlements to natural disasters such as floods is increasing, and mapping changes in land use practices and hydro-geological parameters is essential for developing resilient communities and strong disaster management policies. The objective of this study was to develop a flood hazard zonation map of the Orai Khola watershed and to map the decadal land use/land cover dynamics of the watershed. The watershed area was delineated using the SRTM DEM, and LANDSAT images were classified into five land use classes (forest, grassland, sediment and bare land, settlement area and cropland, and water body) using pixel-based semi-automated supervised maximum likelihood classification. Decadal changes in each class were then quantified using spatial modelling. Flood hazard mapping was performed by assigning weights to the factors slope, rainfall distribution, distance from the river, and land use/land cover on the basis of their estimated influence in causing flood hazard, then performing weighted overlay analysis to identify highly vulnerable areas. Forest and grassland coverage increased by 11.53 km² (3.8%) and 1.43 km² (0.47%), respectively, from 1996 to 2016. Sediment and bare land areas decreased by 12.45 km² (4.12%) over the same period, whereas settlement and cropland areas showed a consistent increase of 14.22 km² (4.7%), and water body coverage also increased by 0.3 km² (0.09%). Of the total watershed area, 1.27% (3.65 km²) was categorized into the very low hazard zone, 20.94% (60.31 km²) into the low hazard zone, 37.59% (108.3 km²) into the moderate hazard zone, and 29.25% (84.27 km²) into the high hazard zone, while 31 villages comprising 10.95% (31.55 km²) fell into the very high hazard zone.
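The weighted overlay analysis step can be sketched as a cell-wise weighted sum of reclassified factor rasters. The 2x2 grids, hazard scores, and weights below are purely illustrative assumptions; the study's actual rasters and weights are not reproduced here.

```python
def weighted_overlay(layers, weights):
    """Cell-wise weighted sum of reclassified hazard-factor rasters.

    `layers` is a list of equally sized 2-D grids whose cells hold
    reclassified hazard scores (e.g. 1 = very low ... 5 = very high);
    `weights` holds one influence weight per layer, summing to 1.
    """
    rows, cols = len(layers[0]), len(layers[0][0])
    return [
        [sum(w * layer[r][c] for layer, w in zip(layers, weights))
         for c in range(cols)]
        for r in range(rows)
    ]

# Illustrative 2x2 reclassified rasters for the four factors
slope    = [[1, 3], [4, 5]]
rainfall = [[2, 2], [3, 4]]
distance = [[1, 4], [4, 5]]  # distance from the river, reclassified
landuse  = [[2, 3], [3, 4]]
hazard = weighted_overlay([slope, rainfall, distance, landuse],
                          [0.3, 0.3, 0.2, 0.2])  # hypothetical weights
```

Thresholding the resulting surface then yields the very-low-to-very-high hazard zones reported in the abstract.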

Keywords: flood hazard, land use/land cover, Orai river, supervised maximum likelihood classification, weighted overlay analysis

Procedia PDF Downloads 342
3560 Efficacy of Deep Learning for Below-Canopy Reconstruction of Satellite and Aerial Sensing Point Clouds through Fractal Tree Symmetry

Authors: Dhanuj M. Gandikota

Abstract:

Sensor-derived three-dimensional (3-D) point clouds of trees are invaluable in remote sensing analysis for the accurate measurement of key structural metrics, bio-inventory values, spatial planning/visualization, and ecological modeling. Machine learning (ML) holds potential for addressing the restrictive tradeoffs in cost, spatial coverage, resolution, and information gain that exist among current point cloud sensing methods. Terrestrial laser scanning (TLS) remains the highest-fidelity source of both canopy and below-canopy structural features, but its usage is limited in both coverage and cost, requiring manual deployment to map out large forested areas. While aerial laser scanning (ALS) remains a reliable avenue of LIDAR active remote sensing, it is also cost-restrictive in deployment. Space-borne photogrammetry from high-resolution satellite constellations is an avenue of passive remote sensing that research increasingly supports as viable for the accurate construction of vegetation 3-D point clouds; it provides both the lowest comparative cost and the largest spatial coverage among remote sensing methods. However, both space-borne photogrammetry and ALS are technically limited in capturing valuable below-canopy point cloud data. Looking to minimize these tradeoffs, we explored deep learning (DL), a class of powerful ML algorithms that shows promise in recent research on 3-D point cloud reconstruction and interpolation. Our research details the efficacy of applying these DL techniques to reconstruct accurate below-canopy point clouds from space-borne and aerial remote sensing through learned patterns of tree species' fractal symmetry properties, supplemented by locally sourced bio-inventory metrics. From our dataset of tree point clouds obtained from TLS, we deconstructed each tree's point cloud into those that would be obtained through ALS and satellite photogrammetry of varying resolutions. We fed this ALS/satellite point cloud dataset, along with the simulated local bio-inventory metrics, into the DL point cloud reconstruction architectures to generate full 3-D tree point clouds (the truth values being the full TLS tree point clouds containing the below-canopy information). Point cloud reconstruction accuracy was validated both by measuring error against the original TLS point clouds and by the error in extracting key structural metrics, such as crown base height, diameter above root crown, and leaf/wood volume. The results additionally demonstrate the supplemental performance gain from using minimal locally sourced bio-inventory information as an ML input to reach specified accuracy thresholds of tree point cloud reconstruction. This research provides insight into methods for the rapid, cost-effective, and accurate construction of below-canopy tree 3-D point clouds, as well as the supported potential of ML and DL to learn complex, unmodeled patterns of fractal tree growth symmetry.
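The deconstruction of TLS clouds into overhead-sensor equivalents can be sketched crudely: an overhead sensor mostly sees the highest return per horizontal footprint, so keeping only the top point in each grid cell discards below-canopy structure. This is a minimal illustration of the idea, not the study's actual decimation pipeline, and the 0.5 m cell size and coordinates are assumptions.

```python
def simulate_overhead_view(points, cell=0.5):
    """Crude overhead-sensor simulation: keep only the highest point in
    each horizontal grid cell, discarding below-canopy returns.

    `points` is a list of (x, y, z) tuples from a TLS scan;
    `cell` is the horizontal grid resolution in metres.
    """
    top = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        if key not in top or z > top[key][2]:
            top[key] = (x, y, z)  # retain the canopy-top return
    return list(top.values())

# Two stacked returns in one cell: only the canopy-top point survives
tls = [(0.1, 0.1, 1.0), (0.2, 0.2, 9.0), (3.0, 3.0, 4.0)]
als_like = simulate_overhead_view(tls, cell=0.5)
```

Pairs of (decimated, full) clouds produced this way are exactly the kind of input/truth pairs a reconstruction network is trained on.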

Keywords: deep learning, machine learning, satellite, photogrammetry, aerial laser scanning, terrestrial laser scanning, point cloud, fractal symmetry

Procedia PDF Downloads 96
3559 The Effects of Prosthetic Leg Stiffness on Gait, Comfort, and Satisfaction: A Review of Mechanical Engineering Approaches

Authors: Kourosh Fatehi, Niloofar Hanafi

Abstract:

One of the challenges in providing optimal prosthetic legs for lower limb amputees is to select the appropriate foot stiffness that suits their individual needs and preferences. Foot stiffness affects various aspects of walking, such as stability, comfort, and energy expenditure. However, the current prescription process is largely based on trial-and-error, manufacturer recommendations, or clinician judgment, which may not reflect the prosthesis user’s subjective experience or psychophysical sensitivity. Therefore, there is a need for more scientific and technological tools to measure and understand how prosthesis users perceive and prefer different foot stiffness levels, and how this preference relates to clinical outcomes. This review covers how to measure and design lower leg prostheses based on user preference and foot stiffness. It also explores how these factors affect walking outcomes and quality of life, and identifies the current challenges and gaps in this field from a mechanical engineering standpoint.

Keywords: perception, preference, prosthetics, stiffness

Procedia PDF Downloads 73
3558 Research Attitude: Its Factor Structure and Determinants in the Graduate Level

Authors: Janet Lynn S. Montemayor

Abstract:

Falling completion rates and rising drop-out rates in graduate school are attributed to the demands that come with research-related requirements. Graduate students tend to withdraw from their studies when confronted with such requirements, an act of succumbing that is primarily due to a negative mindset. An understanding of students’ views of research is essential for teachers facilitating research activities in graduate school. This study aimed to develop a tool that accurately measures attitude towards research, and the psychometric properties of the resulting Research Attitude Inventory (RAIn) were assessed. A pool of items (k = 50) was initially constructed and administered to a development sample of master’s and doctorate degree students (n = 159). Results show that the RAIn is a reliable measure of research attitude (k = 41, αmax = 0.894). Principal component analysis using orthogonal rotation with Kaiser normalization identified four underlying factors of research attitude: predisposition, purpose, perspective, and preparation. Research attitude among the respondents was then analyzed using this measure.
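The component-retention step behind a factor structure like the one above is often the Kaiser criterion: keep components whose correlation-matrix eigenvalues exceed 1. The sketch below illustrates it on simulated item responses with two latent factors; the data, loadings, and seed are all assumptions, not the RAIn data.

```python
import numpy as np

def kaiser_components(data):
    """Number of principal components retained under the Kaiser
    criterion (eigenvalues of the correlation matrix > 1)."""
    corr = np.corrcoef(data, rowvar=False)
    eigvals = np.linalg.eigvalsh(corr)
    return int(np.sum(eigvals > 1.0))

# Simulated responses: four items loading on two latent factors
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))
items = np.column_stack([
    latent[:, 0] + 0.3 * rng.normal(size=200),
    latent[:, 0] + 0.3 * rng.normal(size=200),
    latent[:, 1] + 0.3 * rng.normal(size=200),
    latent[:, 1] + 0.3 * rng.normal(size=200),
])
n = kaiser_components(items)  # recovers the two simulated factors
```

In practice the retained components are then rotated (e.g. varimax with Kaiser normalization, as in the study) before interpreting the factors.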

Keywords: graduate education, principal component analysis, research attitude, scale development

Procedia PDF Downloads 189
3557 Probability Fuzzy Aggregation Operators in Vehicle Routing Problem

Authors: Anna Sikharulidze, Gia Sirbiladze

Abstract:

To evaluate the unreliability levels of movement on closed routes in the vehicle routing problem, a family of fuzzy aggregation operators is constructed. The interactions between routing factors under extreme road conditions are considered, and a multi-criteria decision-making (MCDM) model is constructed. The constructed aggregations are based on the Choquet integral and the associated probability class of a fuzzy measure. Propositions on the correctness of the extension are proved. Connections between the operators and compositions of dual triangular norms are described, and the conjugate connections between the constructed operators are shown. The operators reflect interactions among all combinations of the factors in the fuzzy MCDM process. Several variants of the constructed operators are used in a decision-making problem assessing the unreliability and possibility levels of movement on closed routes.
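The discrete Choquet integral underlying these aggregations can be sketched directly: sort the criterion scores, and weight each increment by the fuzzy measure of the criteria still at or above that level. The two-criterion measure and scores below are illustrative assumptions, not the operators or data of the paper.

```python
def choquet_integral(values, mu):
    """Discrete Choquet integral of `values` (criterion -> score in [0,1])
    with respect to fuzzy measure `mu` (frozenset of criteria -> weight).
    """
    items = sorted(values.items(), key=lambda kv: kv[1])  # ascending scores
    total, prev = 0.0, 0.0
    remaining = set(values)
    for criterion, x in items:
        total += (x - prev) * mu[frozenset(remaining)]
        prev = x
        remaining.discard(criterion)
    return total

# Hypothetical non-additive measure over two routing factors:
# together they matter more than the sum of their parts (interaction)
mu = {
    frozenset(): 0.0,
    frozenset({"road"}): 0.4,
    frozenset({"weather"}): 0.5,
    frozenset({"road", "weather"}): 1.0,
}
score = choquet_integral({"road": 0.6, "weather": 0.2}, mu)
```

With an additive measure this reduces to a weighted average; the non-additivity is what lets the operator model interactions among factor combinations.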

Keywords: vehicle routing problem, associated probabilities of a fuzzy measure, Choquet integral, fuzzy aggregation operator

Procedia PDF Downloads 320
3556 Assessment of Nigerian Newspapers' Reportage of Violence against Children: Case Study of Daily Sun and Punch National Newspapers

Authors: Adline Nkwam-Uwaoma, Mishack Ndukwu

Abstract:

Traditionally, child rearing in Nigeria closely reflects the ‘spare the rod and spoil the child’ maxim, and as such, spanking, flogging, slapping, beating, and even starving a child as punishment for wrongdoing or as a method of behaviour modification are common. These are not necessarily considered maltreatment or abuse of the child. Despite the adoption and implementation of the Child Rights Act in Nigeria, violence against children seems to be on a steady increase. Stories abound of sexual molestation, rape, child labour, infliction of physical injuries, and the use of children for rituals by parents, guardians, or other members of society. Violence against children is understood here as those acts by other persons, especially adults, that undermine and threaten the healthy life and existence of children or that violate their rights as humans. In Nigeria, newspapers are a major source of news, second only to radio and television in coverage, currency, and content. National dailies are newspapers with daily publications and national spread or coverage. This study analysed the frequency, length, prominence level, direction, and sources of information reported on violence against children in the selected national daily newspapers. It then provided information on the role of newspapers in Nigeria in the fight against child violence, public awareness of the impact of violence against children on the development of the nation, and attempts to curtail such violence. The composite week sampling technique, in which the four weeks of the month are reduced to one and a sample is randomly selected for each day of the week, was used. In this way, 168 editions of the Daily Sun and Punch newspapers published from January to December 2016 were selected. Data were collected using a code sheet and analyzed via content analysis. The results showed that the frequency of the newspapers’ reportage of violence against children in Nigeria was low. It was further found that the length or space given to reports on violence against children was inadequate; that the direction of the few reports favoured the fight against child violence; and that these newspapers gave no prominence to such reports. Finally, the major source of news about violence against children came through journalism, with government and individual sources providing only minimal information.
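The composite week sampling described above can be sketched with the standard library: for each weekday, randomly pick one date of that weekday within the month, yielding seven editions per month per paper (12 months x 7 days x 2 papers = 168 editions). The seed and function are illustrative, not the authors' actual sampling procedure.

```python
import random
from datetime import date, timedelta

def composite_week(year, month, seed=42):
    """Build a composite week: for each weekday (Mon..Sun), randomly
    select one date of that weekday falling within the given month."""
    rng = random.Random(seed)
    by_weekday = {wd: [] for wd in range(7)}
    d = date(year, month, 1)
    while d.month == month:
        by_weekday[d.weekday()].append(d)
        d += timedelta(days=1)
    # one randomly chosen date per weekday, ordered Monday..Sunday
    return [rng.choice(days) for wd, days in sorted(by_weekday.items())]

sample = composite_week(2016, 1)  # composite week for January 2016
```

Repeating this for all twelve months of 2016 and both newspapers reproduces the 168-edition sampling frame.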

Keywords: children, newspapers' reportage, Nigeria, violence

Procedia PDF Downloads 151
3555 Empirical Investigation for the Correlation between Object-Oriented Class Lack of Cohesion and Coupling

Authors: Jehad Al Dallal

Abstract:

The design of the internal relationships among object-oriented class members (i.e., attributes and methods) and of the external relationships among classes affects the overall quality of object-oriented software. The degree of relatedness among class members is referred to as class cohesion, and the degree to which a class is related to other classes is called class coupling. Well-designed classes are expected to exhibit high cohesion and low coupling. In this paper, using classes from three open-source Java systems, we empirically investigate the relation between class cohesion and coupling. In the empirical study, five lack-of-cohesion metrics and eight coupling metrics are considered. The results show that the class cohesion and coupling internal quality attributes are inversely correlated, and that the strength of the correlation depends strongly on the cohesion and coupling measurement approaches.
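One of the classic lack-of-cohesion metrics in this family, Chidamber and Kemerer's LCOM, can be sketched directly: over all method pairs in a class, count pairs sharing no attribute (P) versus pairs sharing at least one (Q), and take max(P - Q, 0). The abstract does not name its five metrics, so this is an illustrative example of the genre, with hypothetical classes.

```python
from itertools import combinations

def lcom(methods):
    """Chidamber-Kemerer LCOM for one class.

    `methods` maps method name -> set of instance attributes it uses.
    P = method pairs sharing no attribute, Q = pairs sharing >= 1;
    LCOM = max(P - Q, 0), so higher values mean lower cohesion.
    """
    p = q = 0
    for m1, m2 in combinations(methods.values(), 2):
        if m1 & m2:
            q += 1
        else:
            p += 1
    return max(p - q, 0)

# Cohesive class: every method touches the same attribute
cohesive = {"deposit": {"balance"}, "withdraw": {"balance"},
            "report": {"balance", "owner"}}
# Non-cohesive class: two unrelated attribute clusters
scattered = {"a": {"x"}, "b": {"x"}, "c": {"y"}, "d": {"y"}}
```

Correlating such per-class cohesion values against coupling metrics (e.g. counts of inter-class references) is the kind of analysis the study performs.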

Keywords: class cohesion measure, class coupling measure, object-oriented class, software quality

Procedia PDF Downloads 225
3554 Development of the New York Misophonia Scale: Implications for Diagnostic Criteria

Authors: Usha Barahmand, Maria Stalias, Abdul Haq, Esther Rotlevi, Ying Xiang

Abstract:

Misophonia is a condition in which specific repetitive oral, nasal, or other sounds and movements made by humans trigger impulsive aversive reactions of irritation or disgust that instantly become anger. A few measures exist for the assessment of misophonia, but each has some limitations, and evidence for a formal diagnosis is still lacking. The objective of this study was to develop a reliable and valid measure of misophonia for use in the general population. Adopting a purely descriptive approach, this study focused on developing a self-report measure using all triggers and reactions identified in previous studies on misophonia. A measure with two subscales, one assessing the aversive quality of various triggers and the other assessing reactions of individuals, was developed. Data were gathered from a large sample of both men and women ranging in age from 18 to 65 years. Exploratory factor analysis revealed three main triggers: oral/nasal sounds, hand and leg movements, and environmental sounds. Two clusters of reactions also emerged: nonangry attempts to avoid the impact of the aversive stimuli and angry attempts to stop the aversive stimuli. The examination of the psychometric properties of the scale revealed its internal consistency and test-retest reliability to be excellent. The scale was also found to have very good concurrent and convergent validity. Significant annoyance and disgust in response to the triggers were reported by 12% of the sample, although for some specific triggers, rates as high as 31% were also reported. These findings have implications for the delineation of the criteria for identifying misophonia as a clinical condition.

Keywords: adults, factor analysis, misophonia, psychometric properties, scale

Procedia PDF Downloads 193
3553 Valuation of Caps and Floors in a LIBOR Market Model with Markov Jump Risks

Authors: Shih-Kuei Lin

Abstract:

The arbitrage-free dynamics of interest rates are characterized in this study in the presence of Markov jump risks, with the term structure of interest rates modeled through simple forward rates. We consider Markov jump risks by allowing randomness in jump sizes and independence between jump sizes and jump times. The Markov jump diffusion model is used to capture empirical phenomena and to accurately describe interest rate jump risks in a financial market. We derive the arbitrage-free model of simple forward rates under the spot measure. Moreover, analytical pricing formulas for a cap and a floor are derived under the forward measure when the jump size follows a lognormal distribution. In our empirical analysis, we find that the LIBOR market model with Markov jump risks better accounts for transitions between different states and rates.
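In the jump-free special case, the LIBOR market model's caplet price under the forward measure reduces to the classical Black (1976) formula, sketched below. This is the textbook baseline the paper extends with lognormal jumps, not the paper's jump-augmented formula; the numerical inputs are illustrative.

```python
from math import log, sqrt, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_caplet(F, K, sigma, T, delta, discount, notional=1.0):
    """Black (1976) caplet price: the jump-free case of the lognormal
    LIBOR market model under the forward measure.

    F: current simple forward rate for [T, T + delta]; K: strike;
    sigma: forward-rate volatility; T: fixing time in years;
    delta: accrual period; discount: zero-coupon bond price P(0, T + delta).
    """
    d1 = (log(F / K) + 0.5 * sigma * sigma * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return notional * discount * delta * (F * norm_cdf(d1) - K * norm_cdf(d2))

# At-the-money caplet on a hypothetical 3% semi-annual forward rate
price = black_caplet(F=0.03, K=0.03, sigma=0.2, T=1.0, delta=0.5,
                     discount=0.97)
```

Adding a compensated lognormal jump component modifies this price to a Poisson-weighted sum of Black-type terms, which is the structure the derived cap and floor formulas take.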

Keywords: arbitrage-free, cap and floor, Markov jump diffusion model, simple forward rate model, volatility smile, EM algorithm

Procedia PDF Downloads 417
3552 Qualitative Measurement of Literacy

Authors: Indrajit Ghosh, Jaydip Roy

Abstract:

The literacy rate is an important indicator of human development, but it cannot capture the qualitative dimension of the educational attainment of an individual or a society. The overall educational level of an area is an important issue beyond the literacy rate, and it can be thought of as an outcome of the educational levels of individuals. However, no well-defined algorithm or mathematical model is available to measure the overall educational level of an area; a heuristic approach based on the accumulated experience of experts is an effective alternative. It is evident that fuzzy logic offers a natural and convenient framework for modeling various concepts in the social science domain. This work implements fuzzy logic to develop a mathematical model for measuring the educational attainment of an area in terms of an Education Index. The contribution of the study is twofold: the conceptualization of an 'Education Profile' and the proposal of a new mathematical model to measure educational attainment in terms of an 'Education Index'.
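One plausible minimal sketch of such a model, assuming nothing about the authors' actual formulation: weight an area's education profile (population shares per attainment level) into a crisp index, then fuzzify it into linguistic grades with triangular membership functions. All level scores and membership parameters below are hypothetical.

```python
def triangular(x, a, b, c):
    """Triangular membership function peaking at b on support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def education_index(profile, level_scores):
    """Hypothetical fuzzy Education Index.

    `profile` maps attainment level -> population share (summing to 1);
    `level_scores` maps each level to a score in [0, 1]. Returns the
    crisp index and its fuzzy grades ('low', 'medium', 'high').
    """
    crisp = sum(profile[lvl] * level_scores[lvl] for lvl in profile)
    grades = {
        "low": triangular(crisp, -0.01, 0.0, 0.5),
        "medium": triangular(crisp, 0.0, 0.5, 1.0),
        "high": triangular(crisp, 0.5, 1.0, 1.01),
    }
    return crisp, grades

profile = {"illiterate": 0.2, "primary": 0.4, "secondary": 0.3, "tertiary": 0.1}
scores = {"illiterate": 0.0, "primary": 0.4, "secondary": 0.7, "tertiary": 1.0}
index, grades = education_index(profile, scores)
```

The fuzzy grades, unlike a bare literacy rate, convey how strongly an area belongs to each qualitative attainment category.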

Keywords: education index, education profile, fuzzy logic, literacy

Procedia PDF Downloads 312
3551 An Axiomatic Approach to Constructing an Applied Theory of Possibility

Authors: Oleksii Bychkov

Abstract:

The fundamental difference between randomness and vagueness is that the former requires statistical research; these issues were studied by Zadeh L., Dubois D., and Prade H. The theory of possibility works with expert assessments, hypotheses, etc., and gives an idea of the characteristics of a problem situation, the nature of its goals, and its real limitations. Possibility theory examines experiments that are not repeated. This article discusses issues related to the formalization of fuzzy, uncertain ideas of reality. The author proposes to expand the classical model of possibility theory by introducing a measure of necessity. The proposed model allows the measures of possibility and necessity to be extended onto a Boolean algebra while preserving the properties of the measures; thus, upper and lower estimates are obtained for whether an event will occur. Knowledge of the patterns that govern mass random, uncertain, and fuzzy events allows us to predict how these events will proceed. The article details the construction and use of possibility theory alongside probability theory.
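The paired upper and lower estimates mentioned above are standard in possibility theory: the possibility of an event is the maximum of the possibility distribution over its outcomes, and the dual necessity is one minus the possibility of the complement, so N(A) ≤ Π(A) always holds. The distribution and event below are illustrative, not from the article's axiomatic construction.

```python
def possibility(event, pi):
    """Possibility Pi(A) = max of the distribution over outcomes in A."""
    return max((pi[w] for w in event), default=0.0)

def necessity(event, pi):
    """Necessity N(A) = 1 - Pi(complement of A): the dual lower bound."""
    complement = set(pi) - set(event)
    return 1.0 - possibility(complement, pi)

# Normalized possibility distribution over four outcomes (max value 1)
pi = {"w1": 1.0, "w2": 0.7, "w3": 0.3, "w4": 0.0}
A = {"w1", "w2"}
upper, lower = possibility(A, pi), necessity(A, pi)  # Pi(A) and N(A)
```

Together the pair (N(A), Π(A)) brackets the degree of confidence that the event will occur, which is exactly the role of the upper and lower estimates in the proposed model.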

Keywords: possibility, artificial, modeling, axiomatics, intellectual approach

Procedia PDF Downloads 19