Search results for: random match probability
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3535

2605 Human Action Retrieval System Using Features Weight Updating Based Relevance Feedback Approach

Authors: Munaf Rashid

Abstract:

For content-based human action retrieval systems, search accuracy is often inferior for the following two reasons: 1) global information pertaining to videos is totally ignored, and only low-level motion descriptors are considered significant features for matching the similarity between query and database videos; and 2) there is a semantic gap between the high-level user concept and low-level visual features. Hence, in this paper, we propose a method that addresses these two issues, and in doing so, this paper contributes in two ways. Firstly, we introduce a method that uses both global and local information in one framework for the action retrieval task. Secondly, to minimize the semantic gap, the user concept is involved by incorporating a features weight updating (FWU) relevance feedback (RF) approach. We use statistical characteristics to dynamically update the weights of the feature descriptors so that the feature space is modified accordingly after every RF iteration. For testing and validation purposes, two human action recognition datasets were utilized, namely Weizmann and UCF. Results show that the proposed approach performs well even under a number of visual challenges.
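The feature-weight updating step can be sketched in outline. The abstract does not give the exact statistic used, so the inverse-standard-deviation rule below is an assumption for illustration: features that vary little across the user's relevant results are judged more discriminative and get larger weights in the next retrieval round.

```python
import statistics

def update_weights(relevant_vectors, eps=1e-6):
    """One weight per feature dimension, normalised to sum to 1.

    Dimensions that are consistent across the relevant videos
    (small spread) receive larger weights.
    """
    dims = list(zip(*relevant_vectors))                  # transpose to columns
    raw = [1.0 / (statistics.pstdev(col) + eps) for col in dims]
    total = sum(raw)
    return [w / total for w in raw]

def weighted_distance(query, candidate, weights):
    """Weighted Euclidean distance used for the next ranking round."""
    return sum(w * (q - c) ** 2
               for w, q, c in zip(weights, query, candidate)) ** 0.5
```

After each RF iteration, `update_weights` would be re-run on the newly marked relevant set, so the feature space is modified accordingly.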

Keywords: relevance feedback (RF), action retrieval, semantic gap, feature descriptor, codebook

Procedia PDF Downloads 456
2604 Applications of Probabilistic Interpolation via Orthogonal Matrices

Authors: Dariusz Jacek Jakóbczak

Abstract:

Mathematics and computer science are interested in methods of 2D curve interpolation and extrapolation using a set of key points (knots). The proposed method of Hurwitz-Radon Matrices (MHR) is such a method. This novel method is based on the family of Hurwitz-Radon (HR) matrices, which possess columns composed of orthogonal vectors. A two-dimensional curve is interpolated via different functions used as probability distribution functions: polynomial, sine, cosine, tangent, cotangent, logarithm, exponent, arcsine, arccosine, arctangent, or arccotangent, as well as power functions and inverse functions. It is shown how to build the orthogonal matrix operator and how to use it in the process of curve reconstruction.

Keywords: 2D data interpolation, hurwitz-radon matrices, MHR method, probabilistic modeling, curve extrapolation

Procedia PDF Downloads 516
2603 Body Shape Control of Magnetic Soft Continuum Robots with PID Controller

Authors: M. H. Korayem, N. Sangsefidi

Abstract:

Magnetically guided soft robots have emerged as a promising technology in minimally invasive surgery due to their ability to adapt to complex environments. However, one of the main challenges in this field is damage to the vascular structure caused by unwanted stress on the vessel wall and deformation of the vessel due to improper control of the shape of the robot body during surgery. Therefore, this article proposes an approach for controlling the shape of a magnetic soft continuum robot body using a PID controller. The magnetic soft continuum robot is modelled using Cosserat theory in static mode and solved numerically. The designed controller adjusts the position of each part of the robot to match the desired shape. The PID controller is designed to minimize the robot's contact with the vessel wall and prevent unwanted vessel deformation. The simulation results confirmed the accuracy of the numerical solution of the static Cosserat model. They also showed the effectiveness of the proposed shape control method in achieving the desired shape with a maximum error of about 0.3 millimetres.
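The PID loop at the heart of the approach can be sketched as follows. The plant below is a toy first-order stand-in, and the gains are invented for illustration; they are not the authors' tuned values or the numerical Cosserat model.

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def step(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

# Toy stand-in plant: the controlled point moves in proportion to the command.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
position = 0.0
for _ in range(2000):
    position += pid.step(1.0, position) * 0.01   # drive toward setpoint 1.0
```

In the paper's setting, one such loop would run per controlled section of the robot body, with the Cosserat model in place of the toy plant.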

Keywords: PID, magnetic soft continuous robot, soft robot shape control, Cosserat theory, minimally invasive surgery

Procedia PDF Downloads 83
2602 Use of SUDOKU Design to Assess the Implications of the Block Size and Testing Order on Efficiency and Precision of Dulce De Leche Preference Estimation

Authors: Jéssica Ferreira Rodrigues, Júlio Silvio De Sousa Bueno Filho, Vanessa Rios De Souza, Ana Carla Marques Pinheiro

Abstract:

This study aimed to evaluate the implications of the block size and testing order on the efficiency and precision of preference estimation for dulce de leche samples. Efficiency was defined as the inverse of the average variance of pairwise comparisons among treatments. Precision was defined as the inverse of the variance of treatment mean (or effect) estimates. The experiment was originally designed to test 16 treatments as a series of 8 Sudoku 16x16 designs, 4 randomized independently and the other 4 in the reverse order, to yield balance in testing order. Linear mixed models were fitted to the whole experiment, with 112 testers and all their grades, as well as to its partially balanced subgroups, namely: a) the experiment with the four initial EU; b) the experiment with EU 5 to 8; c) the experiment with EU 9 to 12; and d) the experiment with EU 13 to 16. Responses were recorded on a nine-point hedonic scale, and a mixed linear model was assumed with random tester and treatment effects and a fixed testing order effect. Analysis with a cumulative random-effects probit link model was very similar, with essentially no different conclusions, so for simplicity we present the results under the Gaussian assumption. The R-CRAN library lme4 and its function lmer (Fit Linear Mixed-Effects Models) were used for the mixed models, and the libraries Bayesthresh (default Gaussian threshold function) and ordinal, with the function clmm (Cumulative Link Mixed Model), were used to check the Bayesian analysis of threshold models and the cumulative link probit models. It was noted that the number of samples tested in the same session can influence the acceptance level, underestimating the acceptance. However, providing a large number of samples can help to improve sample discrimination.
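The two summary measures defined at the start of the abstract can be written down directly. The variance-covariance matrix in the test is an invented illustration, not estimates from the experiment.

```python
def precision(variances):
    """Inverse of the average variance of treatment mean estimates."""
    return 1.0 / (sum(variances) / len(variances))

def efficiency(V):
    """Inverse of the average variance of all pairwise treatment
    comparisons, with Var(t_i - t_j) = V[i][i] + V[j][j] - 2*V[i][j]."""
    n = len(V)
    pair_vars = [V[i][i] + V[j][j] - 2 * V[i][j]
                 for i in range(n) for j in range(i + 1, n)]
    return 1.0 / (sum(pair_vars) / len(pair_vars))
```

Comparing these two numbers across block sizes and testing orders is what the study's design comparison amounts to.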

Keywords: acceptance, block size, mixed linear model, testing order

Procedia PDF Downloads 312
2601 Modelling a Hospital as a Queueing Network: Analysis for Improving Performance

Authors: Emad Alenany, M. Adel El-Baz

Abstract:

In this paper, the flow of different classes of patients into a hospital is modelled and analyzed using the queueing network analyzer (QNA) algorithm and discrete event simulation. The input data for QNA are the rate and variability parameters of the arrival and service times, in addition to the number of servers in each facility. The modelled patient flows closely match the real flows of a hospital in Egypt. Based on the analysis of waiting times, two approaches are suggested for improving performance: separating patients into service groups, and adopting different service policies for sequencing patients through hospital units. Separating a specific group of patients with a higher performance target, to be served apart from the rest of the patients with a lower performance target, requires the same capacity while improving performance for the selected group. Moreover, it is shown that adopting the shortest processing time (SPT) and shortest remaining processing time (SRPT) service policies, among the other tested policies, would result in 11.47% and 13.75% reductions in average waiting time, respectively, relative to the first-come-first-served policy.
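The effect of the SPT sequencing policy can be illustrated on a toy single-server queue (not the paper's hospital network or QNA model): with all jobs present at time zero, serving the shortest job first reduces the average waiting time relative to an arbitrary first-come-first-served order.

```python
def avg_wait(service_times):
    """Average waiting time in a single-server queue when all jobs are
    present at time zero and served in the given order."""
    clock, total = 0.0, 0.0
    for s in service_times:
        total += clock          # this job waits for all jobs served before it
        clock += s
    return total / len(service_times)

fcfs_order = [5.0, 1.0, 3.0]        # arbitrary arrival order
spt_order = sorted(fcfs_order)      # shortest processing time first
```

Here `avg_wait(fcfs_order)` is 11/3 while `avg_wait(spt_order)` is 5/3, the same qualitative gain the paper reports for its hospital units.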

Keywords: queueing network, discrete-event simulation, health applications, SPT

Procedia PDF Downloads 176
2600 Establishment of a Classifier Model for Early Prediction of Acute Delirium in Adult Intensive Care Unit Using Machine Learning

Authors: Pei Yi Lin

Abstract:

Objective: The objective of this study is to use machine learning methods to build an early prediction classifier model for acute delirium, to improve the quality of medical care for intensive care patients. Background: Delirium is a common acute and sudden disturbance of consciousness in critically ill patients. Once it occurs, it tends to prolong the length of hospital stay and increase medical costs and mortality. In 2021, the incidence of delirium in the internal medicine intensive care unit was as high as 59.78%, which indirectly prolonged the average length of hospital stay by 8.28 days, and the mortality rate has been about 2.22% over the past three years. Therefore, this study set out to build a delirium prediction classifier through big data analysis and machine learning methods to detect delirium early. Method: This is a retrospective study using an artificial intelligence big data database to extract the characteristic factors related to delirium in intensive care unit patients for machine learning. The study included patients aged over 20 years who were admitted to the intensive care unit between May 1, 2022, and December 31, 2022, excluding those with a GCS assessment <4 points, admission to the ICU for less than 24 hours, or no CAM-ICU evaluation. Each CAM-ICU delirium assessment, performed every 8 hours within 30 days of hospitalization, is regarded as an event, and the cumulative data from ICU admission to the prediction time point are extracted to predict the possibility of delirium occurring in the next 8 hours. A total of 63,754 case records were collected, and 12 features were selected to train the model, including age, sex, average ICU stay in hours, visual and auditory abnormalities, RASS assessment score, APACHE-II score, number of indwelling invasive catheters, restraint, and sedative and hypnotic drugs.
Through feature data cleaning, processing, and supplementation with the KNN interpolation method, a total of 54,595 case events were extracted for machine learning model analysis. The events from May 01 to November 30, 2022, were used as the model training data, of which 80% formed the training set for model training and 20% the validation set for internal verification; the events from December 01 to December 31, 2022, were used as the external verification set. Finally, model inference and performance evaluation were performed, and the model was then retrained with adjusted parameters. Results: In this study, four machine learning models, XGBoost, Random Forest, Logistic Regression, and Decision Tree, were analyzed and compared. In internal verification, the average accuracy was highest for Random Forest (AUC=0.86); in external verification, Random Forest and XGBoost were highest, both with AUC=0.86; and in cross-validation, the average accuracy was highest for Random Forest (ACC=0.77). Conclusion: Clinically, medical staff usually conduct CAM-ICU assessments at the bedside of critically ill patients, but there is a lack of machine learning classification methods to assist in the real-time assessment of ICU patients, so no objective and continuous monitoring data are available to help clinical staff more accurately identify and predict the occurrence of delirium. It is hoped that developing predictive models through machine learning can predict delirium early and immediately, support clinical decisions at the best time, and, together with PADIS delirium care measures, provide individualized non-drug interventions to maintain patient safety and thereby improve the quality of care.
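The KNN interpolation step used during data cleaning can be sketched as follows. The abstract does not specify k or the distance metric, so both, like the toy records in the test, are assumptions for illustration.

```python
def knn_impute(rows, target_idx, missing_col, k=3):
    """Fill one missing value with the mean of that column over the k
    complete rows nearest to the target row (Euclidean distance on the
    remaining columns)."""
    target = rows[target_idx]
    candidates = [r for i, r in enumerate(rows)
                  if i != target_idx and r[missing_col] is not None]

    def dist(row):
        return sum((a - b) ** 2
                   for j, (a, b) in enumerate(zip(row, target))
                   if j != missing_col) ** 0.5

    nearest = sorted(candidates, key=dist)[:k]
    return sum(r[missing_col] for r in nearest) / len(nearest)
```

In the study this supplementation turned partially complete records into usable events before the 80/20 train/validation split.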

Keywords: critically ill patients, machine learning methods, delirium prediction, classifier model

Procedia PDF Downloads 56
2599 The Association between Acupuncture Treatment and a Decreased Risk of Irritable Bowel Syndrome in Patients with Depression

Authors: Greg Zimmerman

Abstract:

Background: Major depression is a common illness that affects millions of people globally. It is the leading cause of disability and is projected to become the number one cause of the global burden of disease by 2030. Many of those who suffer from depression also suffer from irritable bowel syndrome (IBS). Acupuncture has been shown to help with depression. The aim of this study was to investigate the effectiveness of acupuncture in reducing the risk of IBS in patients with depression. Methods: We enrolled patients diagnosed with depression through the Taiwanese National Health Insurance Research Database (NHIRD). Propensity score matching was used to match equal numbers (n=32,971) in the acupuncture and no-acupuncture cohorts based on characteristics including sex, age, baseline comorbidity, and medication. The Cox regression model was used to compare the hazard ratios (HRs) of IBS in the two cohorts. Results: The baseline characteristics of the two groups were similar. The cumulative incidence of IBS was significantly lower in the acupuncture cohort than in the no-acupuncture cohort (log-rank test, p<0.001). Conclusion: The results provide real-world evidence that acupuncture may have a beneficial effect on IBS risk reduction in patients with depression.
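The propensity score matching step can be illustrated with a greedy 1:1 nearest-neighbour sketch. The paper does not state its exact matching algorithm or caliper, so this is an assumption, and the scores in the test are invented.

```python
def greedy_match(treated_scores, control_scores):
    """Pair each treated unit with the nearest unused control by
    propensity score (greedy, 1:1, without replacement)."""
    available = list(enumerate(control_scores))
    pairs = []
    for t_idx, t in enumerate(treated_scores):
        j, (c_idx, _) = min(enumerate(available),
                            key=lambda e: abs(e[1][1] - t))
        pairs.append((t_idx, c_idx))
        available.pop(j)             # each control is used at most once
    return pairs
```

Matching on the estimated probability of receiving acupuncture, given sex, age, comorbidity, and medication, is what makes the two cohorts comparable before the Cox regression.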

Keywords: acupuncture, depression, irritable bowel syndrome, national health insurance research database, real-world evidence

Procedia PDF Downloads 97
2598 Multi-Criteria Decision Approach to Performance Measurement Techniques Data Envelopment Analysis: Case Study of Kerman City’s Parks

Authors: Ali A. Abdollahi

Abstract:

During the last several decades, scientists have consistently applied Multiple Criteria Decision-Making (MCDM) methods to decisions about multi-faceted, complicated subjects. In making such decisions, and in order to achieve more accurate evaluations, they have regularly used a variety of criteria instead of a single optimum evaluation criterion. The method presented here utilizes both 'quantity' and 'quality' to assess the performance of the multiple-criteria method. Applying the Data Envelopment Analysis (DEA), Weighted Aggregated Sum Product Assessment (WASPAS), Weighted Sum Approach (WSA), Analytic Network Process (ANP), and Charnes, Cooper, Rhodes (CCR) methods, we have analyzed thirteen parks in Kerman city. The analysis indicates that the rankings from WASPAS and WSA are compatible with each other, but that their deviation from DEA is extensive. Finally, the results of the CCR technique do not match the results of the DEA technique. Our study indicates that the ANP method, with an average rate of 1/51, ranks closest to the DEA method, which has an average rate of 1/49.
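Of the methods compared, the Weighted Sum Approach is the simplest to state, and a minimal sketch follows. The decision matrix, weights, and the max-normalisation of benefit criteria below are illustrative assumptions, not the park data from the study.

```python
def wsa_rank(matrix, weights):
    """Rank alternatives (rows) by the weighted sum of max-normalised
    benefit criteria (columns); higher score ranks first."""
    cols = list(zip(*matrix))
    norm = [[v / max(col) for v in col] for col in cols]
    scores = [sum(w * norm[j][i] for j, w in enumerate(weights))
              for i in range(len(matrix))]
    ranking = sorted(range(len(matrix)), key=lambda i: -scores[i])
    return ranking, scores
```

Comparing such rankings across WSA, WASPAS, ANP, and CCR against the DEA ranking is the kind of agreement analysis the abstract reports.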

Keywords: multiple criteria decision making, Data envelopment analysis (DEA), Charnes Cooper Rhodes (CCR), Weighted Sum Approach (WSA)

Procedia PDF Downloads 199
2597 Evaluation of External Costs of Traffic Accident in Slovak Republic

Authors: Anna Dolinayova, Jozef Danis, Juraj Camaj

Abstract:

This paper compares traffic accidents in the Slovak Republic in road and rail transport from 2009 to 2014, evaluates their external costs, and consequently examines the possibilities of their internalization. The road traffic accidents are analyzed in line with the after-effects they have caused, the main cause, the place of origin (within or outside of town), and the age of the people who were killed, seriously injured, or slightly injured in them. The evaluation of individual after-effects is carried out in terms of the probability of traffic accident occurrence.

Keywords: external costs, traffic accident, rail transport, road transport

Procedia PDF Downloads 573
2596 Machine Learning Approach for Predicting Students’ Academic Performance and Study Strategies Based on Their Motivation

Authors: Fidelia A. Orji, Julita Vassileva

Abstract:

This research aims to develop machine learning models for students' academic performance and study strategy prediction that could be generalized to all courses in higher education. The key learning attributes (intrinsic, extrinsic, autonomy, relatedness, competence, and self-esteem) used in building the models were chosen based on prior studies, which revealed that these attributes are essential in students' learning process. Previous studies revealed the individual effects of each of these attributes on students' learning progress. However, few studies have investigated the combined effect of the attributes in predicting student study strategy and academic performance in order to reduce the dropout rate. To bridge this gap, we used Scikit-learn in Python to build five machine learning models (Decision Tree, K-Nearest Neighbour, Random Forest, Linear/Logistic Regression, and Support Vector Machine) for both regression and classification tasks to perform our analysis. The models were trained, evaluated, and tested for accuracy using data on 924 university dentistry students collected by Chilean authors through a quantitative research design. A comparative analysis of the models revealed that tree-based models such as the random forest (with a prediction accuracy of 94.9%) and the decision tree show the best results compared to the linear, support vector, and k-nearest neighbour models. The models built in this research can be used to predict student performance and study strategy so that appropriate interventions can be implemented to improve student learning progress. Thus, incorporating strategies that could improve diverse student learning attributes in the design of online educational systems may increase the likelihood of students continuing with their learning tasks as required. Moreover, the results show that the attributes can be modelled together and used to adapt/personalize the learning process.
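One of the five models, k-nearest neighbours, is simple enough to sketch without Scikit-learn. This is a pure-Python stand-in, and the feature vectors and labels in the test are invented motivational scores, not the dentistry-student data.

```python
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote of its k nearest training points
    (squared Euclidean distance)."""
    order = sorted(range(len(train_X)),
                   key=lambda i: sum((a - b) ** 2
                                     for a, b in zip(train_X[i], x)))
    votes = Counter(train_y[i] for i in order[:k])
    return votes.most_common(1)[0][0]
```

In the study's setting, each row would hold a student's six learning-attribute scores and the label would be their study strategy or performance band.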

Keywords: classification models, learning strategy, predictive modeling, regression models, student academic performance, student motivation, supervised machine learning

Procedia PDF Downloads 111
2595 Clusterization Probability in 14N Nuclei

Authors: N. Burtebayev, Sh. Hamada, Zh. Kerimkulov, D. K. Alimov, A. V. Yushkov, N. Amangeldi, A. N. Bakhtibaev

Abstract:

The main aim of the current work is to examine whether 14N is a candidate for a clusterized nucleus. In order to check this tendency, we have measured the angular distributions for a 14N ion beam elastically scattered on 12C target nuclei at different low energies, 17.5, 21, and 24.5 MeV, which are close to the Coulomb barrier energy for the 14N+12C nuclear system. The study of various transfer reactions could provide us with useful information about the tendency of nuclei to exist in a composite form (core + valence). The experimental data were analyzed using two approaches: phenomenological (optical potential) and semi-microscopic (double folding potential). The agreement between the experimental data and the theoretical predictions is fairly good over the whole angular range.

Keywords: deuteron transfer, elastic scattering, optical model, double folding, density distribution

Procedia PDF Downloads 321
2594 Underrepresentation of Right Middle Cerebral Infarct: A Statistical Parametric Mapping

Authors: Wi-Sun Ryu, Eun-Kee Bae

Abstract:

Prior studies have shown that patients with right hemispheric stroke are less likely to seek medical services compared with those with left hemispheric stroke. However, the underlying mechanism for this phenomenon is unknown. In the present study, we generated lesion probability maps in patients with right and left middle cerebral artery infarcts and statistically compared them. We found that involvement of the precentral gyrus-Brodmann area 44, a language area in the left hemisphere, was significantly higher in patients with left hemispheric stroke. This finding suggests that language dysfunction is more noticeable, thereby taking more patients to hospitals.

Keywords: cerebral infarct, brain MRI, statistical parametric mapping, middle cerebral infarct

Procedia PDF Downloads 326
2593 New Two-Way Map-Reduce Join Algorithm: Hash Semi Join

Authors: Marwa Hussein Mohamed, Mohamed Helmy Khafagy, Samah Ahmed Senbel

Abstract:

MapReduce is a programming model used to handle and support massive data sets. The rapid increase in data size and big data are the most important issues today for the analysis of such data. MapReduce is used to analyze data and obtain more helpful information using two simple functions, map and reduce, which are the only parts written by the programmer, and it provides load balancing, fault tolerance, and high scalability. The most important operation in data analysis is the join, but MapReduce does not support joins directly. This paper explains two two-way MapReduce join algorithms, semi-join and per-split semi-join, and proposes a new algorithm, hash semi-join, that uses a hash table to increase performance by eliminating unused records as early as possible and by performing the join with a hash table, rather than using a map function to match the join key with the other data table in the second phase. Using hash tables does not inflate memory usage because only the matched records from the second table are saved. Our experimental results show that the hash semi-join algorithm has higher performance than the two other algorithms as the data size increases from 10 million records to 500 million, with running time increasing according to the number of joined records between the two tables.
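The core idea of the hash semi-join can be sketched in a few lines (a single-process illustration, not the distributed MapReduce implementation; the table and key names are invented):

```python
def hash_semi_join(small_table, large_table, small_key, large_key):
    """Semi-join: build a hash set of join keys from the small table,
    then stream the large table and keep only matching records, so
    non-matching records are discarded as early as possible."""
    keys = {row[small_key] for row in small_table}   # hash table of keys only
    return [row for row in large_table if row[large_key] in keys]
```

Because only the join keys of the small table (and the matching records) are retained, the memory footprint stays small, which is the property the abstract emphasizes.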

Keywords: map reduce, hadoop, semi join, two way join

Procedia PDF Downloads 501
2592 The Influence of Fashion Bloggers on the Pre-Purchase Decision for Online Fashion Products among Generation Y Female Malaysian Consumers

Authors: Mohd Zaimmudin Mohd Zain, Patsy Perry, Lee Quinn

Abstract:

This study explores how fashion consumers are influenced by fashion bloggers towards the pre-purchase decision for online fashion products in a non-Western context. Malaysians rank among the world's most avid online shoppers, with apparel the third most popular purchase category. However, extant research on fashion blogging focuses on the developed Western market context. Numerous international fashion retailers have entered the Malaysian market, from the luxury to the fast fashion segments; however, Malaysian fashion consumers must balance religious and social norms for modesty with their dress style and adoption of fashion trends. Consumers increasingly mix and match Islamic and Western elements of dress to create new styles, enabling them to follow Western fashion trends whilst paying respect to social and religious norms. Social media have revolutionised the way that consumers can search for and find information about fashion products. For online fashion brands with no physical presence, social media provide a means of discovery for consumers. By allowing the creation and exchange of user-generated content (UGC) online, they provide a public forum that gives individual consumers their own voices, as well as access to product information that facilitates their purchase decisions. Social media empower consumers, and brands have important roles in facilitating conversations among consumers and with themselves, helping consumers connect with them and with one another. Fashion blogs have become important fashion information sources. By sharing their personal style and inspiring their followers with what they wear on popular social media platforms such as Instagram, fashion bloggers have become fashion opinion leaders. By creating UGC to spread useful information to their followers, they influence the pre-purchase decision.
Hence, successful Western fashion bloggers such as Chiara Ferragni may earn millions of US dollars every year, and some have created their own fashion ranges and beauty products, become judges in fashion reality shows, won awards, and collaborated with high street and luxury brands. As fashion blogging has become more established worldwide, increasing numbers of fashion bloggers have emerged from non-Western backgrounds to promote Islamic fashion styles, such as Hassanah El-Yacoubi and Dian Pelangi. This study adopts a qualitative approach using netnographic content analysis of consumer comments on two famous Malaysian fashion bloggers’ Instagram accounts during January-March 2016 and qualitative interviews with 16 Malaysian Generation Y fashion consumers during September-October 2016. Netnography adapts ethnographic techniques to the study of online communities or computer-mediated communications. Template analysis of the data involved coding comments according to the theoretical framework, which was developed from the literature review. Initial data analysis shows the strong influence of Malaysian fashion bloggers on their followers in terms of lifestyle and morals as well as fashion style. Followers were guided towards the mix and match trend of dress with Western and Islamic elements, for example, showing how vivid colours or accessories could be worked into an outfit whilst still respecting social and religious norms. The blogger’s Instagram account is a form of online community where followers can communicate and gain guidance and support from other followers, as well as from the blogger.

Keywords: fashion bloggers, Malaysia, qualitative, social media

Procedia PDF Downloads 207
2591 Accuracy Improvement of Traffic Participant Classification Using Millimeter-Wave Radar by Leveraging Simulator Based on Domain Adaptation

Authors: Tokihiko Akita, Seiichi Mita

Abstract:

A millimeter-wave radar is the sensor most robust against adverse environments, making it an essential environment recognition sensor for automated driving. However, its reflection signal is sparse and unstable, so it is difficult to obtain high recognition accuracy. Deep learning provides high recognition accuracy even for such signals, but it requires large-scale datasets with ground truth. In particular, annotation for a millimeter-wave radar is very costly. As a solution, utilizing a simulator that can generate a huge annotated dataset is effective. However, radar simulation is more difficult to match with real-world data than camera images, and recognition by deep learning with higher-order features using the simulator causes further deviation. We have attempted to improve the accuracy of traffic participant classification by fusing simulator and real-world data with a domain adaptation technique. Experimental results with the domain adaptation network we created show that classification accuracy can be improved even with only a few real-world data samples.
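The simulation-to-real gap can be narrowed in many ways; the authors use a learned domain adaptation network, whose architecture the abstract does not give. As a hedged stand-in, the sketch below shows the simplest form of the idea, per-feature moment matching, which shifts and rescales simulated features toward the statistics of unlabelled real-world data.

```python
import statistics

def align_features(sim, real, eps=1e-9):
    """Moment matching per feature dimension: shift and rescale each
    simulated feature so its mean and standard deviation match the
    unlabelled real-world data (a simple stand-in for a learned
    domain adaptation network)."""
    stats = [(statistics.mean(sc), statistics.pstdev(sc) + eps,
              statistics.mean(rc), statistics.pstdev(rc))
             for sc, rc in zip(zip(*sim), zip(*real))]
    return [[(v - sm) / ss * rs + rm
             for v, (sm, ss, rm, rs) in zip(row, stats)]
            for row in sim]
```

A classifier trained on the aligned simulated features then sees inputs distributed more like the real radar returns, which is the effect the domain adaptation network achieves with far richer transformations.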

Keywords: millimeter-wave radar, object classification, deep learning, simulation, domain adaptation

Procedia PDF Downloads 81
2590 Assertion-Driven Test Repair Based on Priority Criteria

Authors: Ruilian Zhao, Shukai Zhang, Yan Wang, Weiwei Wang

Abstract:

Repairing broken test cases is an expensive and challenging task in evolving software systems. Although an automated repair technique with intent preservation has been proposed, it does not take into account the association between test repairs and assertions, leading to a large number of irrelevant candidates and decreasing repair capability. This paper proposes an assertion-driven test repair approach. Furthermore, an intent-oriented priority criterion is proposed to guide repair candidate generation, making the repairs closer to the intent of the test. In more detail, repair targets are determined through post-dominance relations between assertions and the methods that directly cause compilation errors. Then, test repairs are generated from the targets in a bottom-up way, guided by the intent-oriented priority criteria. Finally, the generated repair candidates are prioritized to match the original test intent. The approach is implemented and evaluated on a benchmark of 4 open-source programs and 91 broken test cases. The results show that the approach can fix 89% (81/91) of the broken test cases, which is more effective than the existing intent-preserving test repair approach, and that our intent-oriented priority criteria work well.

Keywords: test repair, test intent, software test, test case evolution

Procedia PDF Downloads 115
2589 Non-Governmental Organisations and Human Development in Bauchi State, Nigeria

Authors: Sadeeq Launi

Abstract:

NGOs, the world over, have been recognized as part of the institutions that complement government activities in providing services to the people, particularly in respect of human development. This study examined the role played by NGOs in human development in Bauchi State, Nigeria, between 2004 and 2013. The emphasis was on the reproductive health and access-to-education roles of the selected NGOs. All the research questions, objectives, and hypotheses were stated in line with these variables. The theoretical framework that guided the study was the participatory development approach. Being a survey research, data were generated from both primary and secondary sources, with questionnaires and interviews as the instruments for generating the primary data. The population of the study was made up of the staff of the selected NGOs, beneficiaries, and health staff and school teachers in Bauchi State, with samples of 90, 107, and 148 units drawn from these categories, respectively. Stratified random and simple random sampling techniques were adopted. Data were analyzed quantitatively and qualitatively, and hypotheses were tested using the Pearson Chi-square test through the SPSS statistical package. The study revealed that, despite the challenges facing NGO operations in the study area, NGOs rendered services in the areas of health and education. This research recommends, among others, that both government and the people should be more cooperative with NGOs to enable them to provide more efficient and effective services. Governments at all levels should be more dedicated to increasing the accessibility and affordability of basic education and reproductive health care facilities and services in Bauchi State by committing more resources to the health and education sectors; this would support and facilitate the complementary role of NGOs in providing teaching facilities, drugs, and other reproductive health services in the state.
More enlightenment campaigns should be carried out by governments to sensitize the public, particularly women, on the need to embrace the immunization programmes for their children and the antenatal care services being provided by both the government and NGOs.

Keywords: access to education, human development, NGOs, reproductive health

Procedia PDF Downloads 164
2588 Landslide Susceptibility Mapping Using Soft Computing in Amhara Saint

Authors: Semachew M. Kassa, Africa M Geremew, Tezera F. Azmatch, Nandyala Darga Kumar

Abstract:

Because landslides can seriously harm both the environment and society, frequency ratio (FR) and analytical hierarchy process (AHP) methods have been developed from past landslide failure points to produce landslide susceptibility maps. However, it is still difficult to select the most efficient method and to correctly identify the main driving factors for particular regions. In this study, we used fourteen landslide conditioning factors (LCFs) and five soft computing algorithms, namely Random Forest (RF), Support Vector Machine (SVM), Logistic Regression (LR), Artificial Neural Network (ANN), and Naïve Bayes (NB), to predict landslide susceptibility at a 12.5 m spatial resolution. The performance of the RF (F1-score: 0.88, AUC: 0.94), ANN (F1-score: 0.85, AUC: 0.92), and SVM (F1-score: 0.82, AUC: 0.86) methods was significantly better than that of the LR (F1-score: 0.75, AUC: 0.76) and NB (F1-score: 0.73, AUC: 0.75) methods, according to the classification results based on inventory landslide points. The findings also showed that around 35% of the study region consisted of areas with high and very high landslide risk (susceptibility greater than 0.5). The very high-risk locations were primarily found in the western and southeastern regions, and all five models showed good agreement and similar geographic distribution patterns of landslide susceptibility. The areas with the highest landslide risk include the western and northern parts of Amhara Saint Town and the St. Gebreal Church villages, with mean susceptibility values greater than 0.5. Rainfall, distance to road, and slope were typically among the top leading factors for most villages, although the primary contributing factors to landslide vulnerability varied slightly across the five models. Decision-makers and policy planners can use the information from our study to make informed decisions and establish policies. It also suggests that different places should take different safeguards to reduce or prevent serious damage from landslide events.
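The model comparison above hinges on the F1-score. A minimal sketch of how such a score and ranking are computed (the confusion-matrix counts here are hypothetical, not the study's data; only the ranking uses the scores reported in the abstract):

```python
def f1_score(tp, fp, fn):
    """F1 is the harmonic mean of precision and recall,
    computed from confusion-matrix counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# The abstract's reported F1-scores, used here only to rank the five models.
reported_f1 = {"RF": 0.88, "ANN": 0.85, "SVM": 0.82, "LR": 0.75, "NB": 0.73}
ranking = sorted(reported_f1, key=reported_f1.get, reverse=True)
```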

Keywords: artificial neural network, logistic regression, landslide susceptibility, naïve Bayes, random forest, support vector machine

Procedia PDF Downloads 59
2587 Automated Ultrasound Carotid Artery Image Segmentation Using Curvelet Threshold Decomposition

Authors: Latha Subbiah, Dhanalakshmi Samiappan

Abstract:

In this paper, we propose denoising common carotid artery (CCA) B-mode ultrasound images by a curvelet threshold decomposition approach, followed by automatic segmentation of the intima-media thickness and the adventitia boundary. The decomposition preserves the local geometry of the image and the directions of its gradients. The components are then combined into a single vector-valued function, which removes noise patches, and a double threshold is applied to remove the speckle noise inherent in the image. The denoised image is segmented by an active contour without specifying seed points. Combined with level set theory, the contours provide sub-regions with continuous boundaries, and the deformable contours match the shapes and motion of objects in the images. A constrained curve or surface is evolved from the image so that it is pulled toward the required image features; region-based and boundary-based information are integrated to obtain the contour. The method handles multiplicative speckle noise, as reflected in both objective and subjective quality measurements, and thus leads to better segmentation results. The proposed denoising method gives better performance metrics than other state-of-the-art denoising algorithms.
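The curvelet transform itself is beyond a short sketch, but the double-threshold rule described above can be illustrated on a plain list of transform coefficients (the threshold values and coefficient magnitudes below are hypothetical, not the paper's):

```python
def double_threshold(coeffs, t_low, t_high):
    """Zero weak coefficients, keep strong ones, and shrink the band between:
    a simple stand-in for a double-threshold denoising rule."""
    out = []
    for c in coeffs:
        a = abs(c)
        if a <= t_low:
            out.append(0.0)                    # treated as (speckle) noise
        elif a >= t_high:
            out.append(c)                      # treated as signal, kept as-is
        else:
            out.append((c / a) * (a - t_low))  # shrink the intermediate band
    return out

denoised = double_threshold([0.1, -0.5, 2.0], t_low=0.2, t_high=1.0)
```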

Keywords: curvelet, decomposition, level set, ultrasound

Procedia PDF Downloads 329
2586 Pion/Muon Identification in a Nuclear Emulsion Cloud Chamber Using Neural Networks

Authors: Kais Manai

Abstract:

The main part of this work focuses on the study of pion/muon separation at low energy using a nuclear Emulsion Cloud Chamber (ECC) made of lead and nuclear emulsion films. The work consists of two parts: a particle reconstruction algorithm and a neural network that assigns to each reconstructed particle the probability of being a muon or a pion. The pion/muon separation algorithm has been optimized using a detailed Monte Carlo simulation of the ECC and tested on real data. The algorithm achieves a 60% muon identification efficiency with a pion misidentification rate smaller than 3%.

Keywords: nuclear emulsion, particle identification, tracking, neural network

Procedia PDF Downloads 488
2585 Collocation Method for Coupled System of Boundary Value Problems with Cubic B-Splines

Authors: K. N. S. Kasi Viswanadham

Abstract:

Coupled systems of second order linear and nonlinear boundary value problems occur in various fields of science and engineering. In the formulation of the problem, any one of 81 possible types of boundary conditions may occur; these 81 possible boundary conditions are written as combinations of four basic boundary conditions. To solve a coupled system of boundary value problems with these converted boundary conditions, a collocation method with cubic B-splines as basis functions has been developed. In the collocation method, the mesh points of the space variable domain are selected as the collocation points, and the basis functions are redefined into a new set whose number matches the number of mesh points in the space variable domain. The solution of a nonlinear boundary value problem is obtained as the limit of a sequence of solutions of linear boundary value problems generated by the quasilinearization technique. Several linear and nonlinear boundary value problems are presented to test the efficiency of the proposed method, and the numerical results obtained by the present method are in good agreement with the exact solutions available in the literature.
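A minimal sketch of the cubic B-spline basis functions that underlie such a collocation method, evaluated via the Cox-de Boor recursion (the uniform knot vector below is illustrative; the paper's redefined basis set is not reproduced here):

```python
def bspline_basis(i, k, t, knots):
    """Cox-de Boor recursion: value of the i-th B-spline of degree k at t."""
    if k == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + k] != knots[i]:
        left = ((t - knots[i]) / (knots[i + k] - knots[i])
                * bspline_basis(i, k - 1, t, knots))
    if knots[i + k + 1] != knots[i + 1]:
        right = ((knots[i + k + 1] - t) / (knots[i + k + 1] - knots[i + 1])
                 * bspline_basis(i + 1, k - 1, t, knots))
    return left + right

# On a uniform knot vector, the cubic (k = 3) basis functions sum to one
# inside the valid interval (partition of unity), a property that makes
# them convenient basis functions at the collocation points.
knots = [0, 1, 2, 3, 4, 5, 6, 7, 8]
total = sum(bspline_basis(i, 3, 4.0, knots) for i in range(5))
```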

Keywords: collocation method, coupled system, cubic B-splines, mesh points

Procedia PDF Downloads 201
2584 The Search of the Possibility of Running the Six Sigma Process in an IT Education Center

Authors: Mohammad Amini, Aliakbar Alijarahi

Abstract:

This research, titled 'The Search of the Possibility of Running the Six Sigma Process in an IT Education Center', aims to test the feasibility of running the Six Sigma process in an IT education center system. Six Sigma is a well-established method for reducing process errors. To evaluate running Six Sigma in the IT education center, several variables relevant to this process were selected: (1) the amount of support for the process from the organization's top management; (2) the current level of expertise; (3) the ability of the training system to compensate for shortfalls; (4) the degree of match between the current culture and the Six Sigma culture; and (5) the current quality compared with the quality gained from running Six Sigma. To evaluate these variables, we selected questions for each and compiled them into a 28-question questionnaire, which was distributed in our target population. Since our working environment is highly competitive, the organization needs to reduce errors to a minimum; otherwise, it loses its customers. The questionnaire was given to 55 persons; 50 were filled in and returned. After analyzing the forms, the following results were obtained: (1) the IT education center needs to adopt and run this system (Six Sigma) to improve its process quality; and (2) most of the factors needed to run Six Sigma already exist in the IT education center, but more support is required.

Keywords: education, customer, self-action, quality, continuous improvement process

Procedia PDF Downloads 331
2583 The Employment of Unmanned Aircraft Systems for Identification and Classification of Helicopter Landing Zones and Airdrop Zones in Calamity Situations

Authors: Marielcio Lacerda, Angelo Paulino, Elcio Shiguemori, Alvaro Damiao, Lamartine Guimaraes, Camila Anjos

Abstract:

Accurate information about the terrain is extremely important in disaster management activities and in conflict. This paper proposes the use of Unmanned Aircraft Systems (UAS) for the identification of Airdrop Zones (AZs) and Helicopter Landing Zones (HLZs). In this paper we consider AZs to be zones where troops or supplies are dropped by parachute, and HLZs to be areas where victims can be rescued. The use of digital image processing enables the automatic generation of an orthorectified mosaic and an actual Digital Surface Model (DSM). This methodology allows this fundamental information for post-disaster comprehension of the terrain to be obtained in a short amount of time and with good accuracy. For the identification and classification of AZs and HLZs, images from a DJI Phantom 4 drone were used. The images were obtained with the knowledge and authorization of the responsible sectors and were duly registered with the control agencies. The flight was performed on May 24, 2017, and approximately 1,300 images were obtained during approximately 1 hour of flight. Afterward, new attributes were generated by Feature Extraction (FE) from the original images; the use of multispectral images and complementary attributes generated independently from them increases the accuracy of classification. The attributes in this work include the Declivity Map and Principal Component Analysis (PCA). For the classification, four distinct classes were considered: HLZ 1 - small size (18 m x 18 m); HLZ 2 - medium size (23 m x 23 m); HLZ 3 - large size (28 m x 28 m); and AZ (100 m x 100 m). The decision tree method Random Forest (RF) was used in this work. RF is a classification method that uses a large collection of de-correlated decision trees, each trained on a different random set of samples. The classification result from each tree for each object is called a class vote, and the resulting classification is decided by a majority of class votes. In this case, we used 200 trees for the execution of RF in the software WEKA 3.8, and the classification result was visualized in QGIS Desktop 2.12.3. Through this methodology, it was possible to classify in the study area: 6 areas as HLZ 1, 6 areas as HLZ 2, 4 areas as HLZ 3, and 2 areas as AZ. It should be noted that an area classified as AZ covers the classifications of the other classes and may be used as an AZ or as an HLZ for large (HLZ 3), medium (HLZ 2), or small (HLZ 1) helicopters. Likewise, an area classified as an HLZ for large rotary-wing aircraft (HLZ 3) covers the smaller area classifications, and so on. It was concluded that images obtained with a small UAV are of great use in calamity situations, since they provide data with high accuracy, at low cost and low risk, and with ease and agility in obtaining aerial photographs. This allows the generation, in a short time, of information about the features of the terrain to serve as an important decision support tool.
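The class-vote aggregation described above can be sketched in a few lines (the vote counts below are hypothetical, not taken from the 200-tree WEKA run):

```python
from collections import Counter

def majority_vote(tree_votes):
    """Final Random Forest classification: the class with the most votes
    across the ensemble of decision trees."""
    return Counter(tree_votes).most_common(1)[0][0]

# Hypothetical class votes from a 200-tree ensemble for one terrain object:
votes = ["HLZ 2"] * 120 + ["HLZ 3"] * 50 + ["AZ"] * 30
decision = majority_vote(votes)
```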

Keywords: disaster management, unmanned aircraft systems, helicopter landing zones, airdrop zones, random forest

Procedia PDF Downloads 165
2582 A Comparison of the First Language Vocabulary Used by Indonesian Year 4 Students and the Vocabulary Taught to Them in English Language Textbooks

Authors: Fitria Ningsih

Abstract:

This study concerns the process of building a corpus from Indonesian year 4 students' free writing and comparing it with the vocabulary taught in English language textbooks. Sample writings from 369 students at 19 public elementary schools in Malang, East Java, Indonesia, and 5 selected English textbooks were analyzed with corpus linguistics methods using the AdTAT (Adelaide Text Analysis Tool) program. The analysis produced wordlists of the top 100 words most frequently used by students and the top 100 words given in the English textbooks; there was a 45% match between the two lists. Furthermore, classifying the top 100 most frequent words from the two corpora by part of speech showed that both the Indonesian and English lists employed a similar proportion of nouns, verbs, adjectives, and prepositions. Moreover, to assess how well the vocabulary of the learning materials is contextualized to the students' needs, an in-depth analysis of the content and cultural views of the vocabulary taught in the textbooks was conducted using criteria developed from a checklist. Lastly, suggestions are addressed to language teachers to understand the students' background, such as recognizing the basic words students have already acquired before teaching them new vocabulary, in order to achieve successful learning of the target language.
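A minimal sketch of the wordlist-and-overlap computation (the toy corpora below stand in for the 369 student writings and the five textbooks; AdTAT's exact tokenization rules are not reproduced):

```python
import re
from collections import Counter

def top_words(text, n):
    """Most frequent word types in a text, ranked by raw frequency."""
    words = re.findall(r"[a-z']+", text.lower())
    return [w for w, _ in Counter(words).most_common(n)]

# Toy corpora standing in for the student writings and textbook texts:
students = "the cat sat on the mat and the dog ran to the cat"
textbook = "the dog and the cat play in the park"

student_list = top_words(students, 5)
textbook_list = top_words(textbook, 5)
# Percentage of the student wordlist also present in the textbook wordlist:
match = len(set(student_list) & set(textbook_list)) / len(student_list) * 100
```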

Keywords: corpus, frequency, English, Indonesian, linguistics, textbooks, vocabulary, wordlists, writing

Procedia PDF Downloads 173
2581 Decision Tree Modeling in Emergency Logistics Planning

Authors: Yousef Abu Nahleh, Arun Kumar, Fugen Daver, Reham Al-Hindawi

Abstract:

Despite the availability of natural-disaster-related time series data for the last 110 years, there is no forecasting tool available to humanitarian relief organizations to produce forecasts for emergency logistics planning. This study develops a forecasting tool that identifies the probability of disaster for each country in the world using decision tree modeling. Further, the determination of aggregate forecasts leads to efficient pre-disaster planning. Based on the research findings, relief agencies can optimize the allocation of various resources in emergency logistics planning.
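The per-country disaster probability referred to above can be illustrated with a simple empirical estimate from a yearly time series (the record below is hypothetical, and the paper's decision-tree model is more involved than this frequency count):

```python
def disaster_probability(yearly_counts):
    """Empirical probability that at least one disaster strikes in a year,
    estimated as the fraction of years with a recorded event."""
    years_with_event = sum(1 for c in yearly_counts if c > 0)
    return years_with_event / len(yearly_counts)

# Hypothetical 10-year disaster record for one country:
record = [0, 1, 0, 0, 2, 0, 1, 0, 0, 0]
p = disaster_probability(record)  # 3 of 10 years had at least one event
```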

Keywords: decision tree modeling, forecasting, humanitarian relief, emergency supply chain

Procedia PDF Downloads 469
2580 An Optimal Control Model to Determine Body Forces of Stokes Flow

Authors: Yuanhao Gao, Pin Lin, Kees Weijer

Abstract:

In this paper, we determine the external body force distribution by analyzing Stokes fluid motion using mathematical modeling and numerical approximation. The body force distribution is regarded as the unknown variable and is determined using ideas from optimal control theory. The Stokes flow and its velocity are generated by given forces in a unit square domain. A regularized objective functional is built to match the numerically computed flow velocity with the generated velocity data, so that the force distribution can be determined by minimizing the objective functional, which measures the difference between the numerical and experimental velocities. After applying the Lagrange multiplier method, partial differential equations are formulated that constitute the optimal control system to be solved. The finite element method and the conjugate gradient method are used to discretize the equations and to derive the iterative expression for the target body force, from which the velocity and the body force distribution are computed numerically. The programming environment FreeFEM++ supports the implementation of this model.
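A minimal, dependency-free sketch of the conjugate gradient method used in such discretized optimal control systems, shown here on a small symmetric positive-definite linear system rather than the FreeFEM++ discretization itself:

```python
def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    """Solve A x = b for a symmetric positive-definite matrix A,
    given as nested lists, via the conjugate gradient iteration."""
    n = len(b)
    x = [0.0] * n
    r = b[:]              # residual b - A x (x starts at zero)
    p = r[:]              # initial search direction
    rs_old = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs_old / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [r[i] + (rs_new / rs_old) * p[i] for i in range(n)]
        rs_old = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = conjugate_gradient(A, b)  # exact solution is (1/11, 7/11)
```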

Keywords: optimal control model, Stokes equation, finite element method, conjugate gradient method

Procedia PDF Downloads 388
2579 Optimization of Syngas Quality for Fischer-Tropsch Synthesis

Authors: Ali Rabah

Abstract:

This research received no grant or financial support from any public, commercial, or non-governmental agency. The author conducted this work as part of his normal research activities as a professor of Chemical Engineering at the University of Khartoum, Sudan. While fossil oil reserves have been receding, the demand for diesel and gasoline has been growing. In recent years, syngas of biomass origin has emerged as a viable feedstock for Fischer-Tropsch (FT) synthesis, a process for manufacturing synthetic gasoline and diesel. This paper reports the optimization of syngas quality to match FT synthesis requirements. The optimization model maximizes the thermal efficiency under the constraint H2/CO ≥ 2.0 and the operating conditions of equivalence ratio (0 ≤ ER ≤ 1.0), steam-to-biomass ratio (0 ≤ SB ≤ 5), and gasification temperature (500 °C ≤ Tg ≤ 1300 °C). The optimization model is executed using the optimization section of the Model Analysis Tools of the Aspen Plus simulator and is tested using eleven (11) types of MSW. The optimum operating conditions under which the objective function and the constraint are satisfied are ER = 0, SB = 0.66-1.22, and Tg = 679-763 °C. Under the optimum operating conditions, the syngas quality is H2 = 52.38-58.67 mole percent, LHV = 12.55-17.15 MJ/kg, N2 = 0.38-2.33 mole percent, and H2/CO ≥ 2.15. The generalized optimization model reported here could be extended to any other type of biomass and coal.
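The constrained search described above can be sketched as a grid search over (ER, SB, Tg) that maximizes an efficiency objective subject to H2/CO ≥ 2.0. The `efficiency` and `h2_co_ratio` functions below are toy placeholders, not the Aspen Plus gasification model:

```python
def efficiency(er, sb, tg):
    """Placeholder thermal-efficiency objective (hypothetical shape)."""
    return 1.0 - er - 0.05 * abs(sb - 1.0) - abs(tg - 720) / 2000

def h2_co_ratio(er, sb, tg):
    """Placeholder H2/CO constraint function (hypothetical shape)."""
    return 1.5 + sb - 2.0 * er

best = None
for er in [0.0, 0.25, 0.5]:
    for sb in [0.5, 1.0, 2.0]:
        for tg in [600, 720, 900]:
            if h2_co_ratio(er, sb, tg) >= 2.0:   # feasibility check first
                cand = (efficiency(er, sb, tg), er, sb, tg)
                if best is None or cand > best:
                    best = cand                  # keep the best feasible point
```

Under these toy functions, the search lands at the unconstrained optimum (ER = 0, SB = 1, Tg = 720) because it happens to satisfy the constraint; a real gasification model would trade efficiency against the H2/CO target.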

Keywords: syngas, MSW, optimization, Fischer-Tropsch

Procedia PDF Downloads 63
2578 Developing High-Definition Flood Inundation Maps (HD-FIMs) Using Raster Adjustment with Scenario Profiles (RASP™)

Authors: Robert Jacobsen

Abstract:

Flood inundation maps (FIMs) are an essential tool for communicating flood threat scenarios to the public as well as for floodplain governance. With an increasing demand for online raster FIMs, the FIM state of the practice (SOP) is rapidly advancing to meet the dual requirements of high resolution and high accuracy, that is, high definition. Importantly, today's technology also enables the resolution of local (neighborhood-scale) bias errors that often occur in FIMs, even with the use of SOP two-dimensional flood modeling. To facilitate the development of HD-FIMs, a new GIS method, Raster Adjustment with Scenario Profiles (RASP™), is described for adjusting kernel raster FIMs to match refined scenario profiles. With RASP™, flood professionals can prepare HD-FIMs for a wide range of scenarios from available kernel rasters, including kernel rasters prepared from vector FIMs. The paper provides detailed procedures for RASP™, along with an example of applying RASP™ to prepare an HD-FIM for the August 2016 flood in Louisiana using both an SOP kernel raster and a kernel raster derived from an older vector-based flood insurance rate map. The accuracy of the HD-FIMs achieved by applying RASP™ to the two kernel rasters is evaluated.

Keywords: hydrology, mapping, high-definition, inundation

Procedia PDF Downloads 62
2577 Risk Identification of Investment Feasibility in Indonesia’s Toll Road Infrastructure Investment

Authors: Christo Februanto Putra

Abstract:

This paper presents the identification of risks that affect investment feasibility for toll road infrastructure in Indonesia, using a qualitative survey of expert practitioners among investors, contractors, and state officials. A problem with infrastructure investment in Indonesia, especially under the KPBU contract model, is that many risk factors in the investment plan are not calculated in thorough detail. A risk factor is a value used to provide an overview of the risk level of an event as a function of the probability of its occurrence and the consequences of the risks that arise. The survey results show which risk factors directly affect investment feasibility and rank them by their impact on the investment.

Keywords: risk identification, Indonesia toll road, investment feasibility

Procedia PDF Downloads 266
2576 Structure and Mechanics Patterns in the Assembly of Type V Intermediate-Filament Protein-Based Fibers

Authors: Mark Bezner, Shani Deri, Tom Trigano, Kfir Ben-Harush

Abstract:

Intermediate filament (IF) protein-based fibers are among the toughest fibers in nature, as was shown for native hagfish slime threads and for synthetic fibers based on type V IF proteins, the nuclear lamins. It is assumed that their mechanical performance stems from two major factors: (1) the transition from elastic α-helices to stiff β-sheets during tensile loading; and (2) the specific organization of the coiled-coil proteins into a hierarchical network of nano-filaments. Here, we investigated the interrelationship between these two factors using wet-spun fibers based on C. elegans (Ce) lamin. We found that Ce-lamin fibers, whether assembled in aqueous or alcoholic solutions, had the same nonlinear mechanical behavior, with the elastic region ending at ~5% strain. The pattern of the transition was, however, different: the ratio between α-helices and β-sheets/random coils was relatively constant up to a 20% strain for fibers assembled in an aqueous solution, whereas for fibers assembled in 70% ethanol, the transition ended at a 6% strain. This structural phenomenon in alcoholic solution probably occurred through a transition between compacted and extended conformations of the random coil, and not between α-helices and β-sheets, as cycle analyses had suggested. The different transition pattern can also be explained by the different higher-order organization of Ce-lamins in aqueous or alcoholic solutions, as demonstrated by introducing a point mutation at a conserved residue in the Ce-lamin gene that alters the structure of the Ce-lamin nano-fibrils. In addition, biomimicking the layered structure of silk and hair fibers by coating the Ce-lamin fiber with a hydrophobic layer enhanced fiber toughness and led to a reversible transition between the α-helix and the extended conformation. This work suggests that different hierarchical structures, which are formed under specific assembly conditions, lead to diverse secondary structure transition patterns, which in turn affect the fibers' mechanical properties.

Keywords: protein-based fibers, intermediate filaments (IF) assembly, toughness, structure-property relationships

Procedia PDF Downloads 99