Search results for: biologically inspired algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4215

645 Managing Crowds at Sports Mega Events: Examining the Impact of ‘Fan Parks’ at International Football Tournaments between 2002 and 2016

Authors: Joel Rookwood

Abstract:

Sports mega events have become increasingly significant in sporting, political and economic terms, with analysis often focusing on issues including resource expenditure, development, legacy and sustainability. Transnational tournaments can inspire interest from a variety of demographics, and the operational management of such events can involve contributions from a range of personnel. In addition to television audiences, events also attract attending spectators, and in football contexts the temporary migration of fans from potentially rival nations and teams can present event organising committees and security personnel with various challenges in relation to crowd management. The behaviour, interaction and control of supporters has previously led to incidents of disorder and hooliganism, with damage to property as well as injuries and deaths proving significant consequences. The Heysel tragedy at the 1985 European Cup final in Brussels is a notable example, where 39 fans died following crowd disorder and mismanagement. Football disasters and disorder, particularly in the context of international competition, have inspired responses from police, law makers, event organisers, clubs and associations, including stadium improvements, legislative developments and changes to crowd management practice to improve the effectiveness of spectator safety. The growth and internationalisation of fandom and developments in event management and tourism have seen various responses to the evolving challenges associated with hosting large numbers of visiting spectators at mega events. In football contexts, 'fan parks' are a notable example. Since their first widespread introduction at the 2006 World Cup finals in Germany, these facilities have become a staple element of such mega events. This qualitative, longitudinal, multi-continent research draws on extensive semi-structured interview and observation data. As a frame of reference, this work considers football events staged before and after the development of fan parks. Research was undertaken at four World Cup finals (Japan 2002, Germany 2006, South Africa 2010 and Brazil 2014), four European Championships (Portugal 2004, Switzerland/Austria 2008, Poland/Ukraine 2012 and France 2016), four other confederation tournaments (Ghana 2008, Qatar 2011, USA 2011 and Chile 2015), and four European club finals (Istanbul 2005, Athens 2007, Rome 2009 and Basle 2016). This work found that these parks are typically temporarily erected, specifically located zones where supporters congregate together irrespective of allegiances to watch matches on large screens and partake in other forms of organised on-site entertainment. Such facilities can also allow organisers to control the behaviour, confine the movement and monitor the alcohol consumption of supporters. This represents a notable shift in policy from previous football tournaments, when the widely assumed causal link between alcohol and hooliganism, which frequently shaped legislative and police responses to disorder, also dissuaded some authorities from permitting fans to consume alcohol in and around stadia. It also reflects changing attitudes towards modern football fans. The work also found that in certain contexts supporters have increasingly engaged with such provision, which impacts fan behaviour, but that this is relative to factors including location, facilities, management and security.

Keywords: event, facility, fan, management, park

Procedia PDF Downloads 313
644 Global Navigation Satellite System and Precise Point Positioning as Remote Sensing Tools for Monitoring Tropospheric Water Vapor

Authors: Panupong Makvichian

Abstract:

Global Navigation Satellite System (GNSS) is nowadays a common technology that improves navigation functions in our lives. In addition, GNSS is increasingly employed as an accurate atmospheric sensor. Meteorology is a practical application of GNSS that operates unnoticed in the background of people's lives. GNSS Precise Point Positioning (PPP) is a positioning method that requires data from a single dual-frequency receiver together with precise information about satellite positions and satellite clocks. In addition, careful attention to the mitigation of various error sources is required. All the above data are combined in a sophisticated mathematical algorithm. This research demonstrates how GNSS and the PPP method can provide high-precision estimates, such as 3D positions or zenith tropospheric delays (ZTDs). ZTDs combined with pressure and temperature information allow us to estimate the water vapor in the atmosphere as precipitable water vapor (PWV). If the process is replicated for a network of GNSS sensors, we can create thematic maps that allow the extraction of water content information at any location within the network area. All of the above is possible thanks to advances in GNSS data processing. Therefore, we are able to use GNSS data for climatic trend analysis and the acquisition of further knowledge about atmospheric water content.
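
As a concrete illustration of the ZTD-to-PWV step, the sketch below applies the standard Saastamoinen hydrostatic model and the Bevis conversion factor; the constants come from the GNSS meteorology literature, and the input values are invented, not taken from this work.

```python
import numpy as np

def ztd_to_pwv(ztd_m, pressure_hpa, temp_k, lat_deg, height_m):
    """Convert a zenith total delay (m) to precipitable water vapor (mm)."""
    # Zenith hydrostatic delay (Saastamoinen model)
    phi = np.radians(lat_deg)
    zhd = 0.0022768 * pressure_hpa / (
        1.0 - 0.00266 * np.cos(2.0 * phi) - 0.28e-6 * height_m)
    zwd = ztd_m - zhd                      # zenith wet delay

    # Mean temperature of the atmosphere (Bevis regression)
    tm = 70.2 + 0.72 * temp_k
    k2p, k3 = 22.1, 3.739e5                # K/hPa, K^2/hPa
    rv, rho_w = 461.5, 1000.0              # J/(kg K), kg/m^3

    # Dimensionless factor: PWV = kappa * ZWD (1e8 absorbs unit conversions)
    kappa = 1.0e8 / (rho_w * rv * (k3 / tm + k2p))
    return kappa * zwd * 1000.0            # metres -> millimetres

print(ztd_to_pwv(2.45, 1013.0, 288.0, 15.0, 100.0))  # ~22 mm
```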

Keywords: GNSS, precise point positioning, Zenith tropospheric delays, precipitable water vapor

Procedia PDF Downloads 198
643 Finite Volume Method for Flow Prediction Using Unstructured Meshes

Authors: Juhee Lee, Yongjun Lee

Abstract:

In designing low-energy buildings, the heat transfer through a large glass pane or wall becomes critical. Multiple layers of window glass and wall are employed for high insulation. The gravity-driven air flow between window panes or wall layers is a natural heat convection phenomenon and a key part of the heat transfer. As a first step toward the natural heat transfer analysis, this study presents the development and application of a finite volume method for the numerical computation of viscous incompressible flows. It will become part of a natural convection analysis with a high-order scheme, multi-grid method, and dual time step in the future. A finite volume method based on a fully implicit, second-order scheme is used to discretize and solve the fluid flow on unstructured grids composed of arbitrary-shaped cells. The integrals of the governing equations are discretized in the finite volume manner using a collocated arrangement of variables. The convergence of the SIMPLE segregated algorithm for the solution of the coupled nonlinear algebraic equations is accelerated by using a sparse matrix solver such as BiCGSTAB. The method used in the present study is verified by applying it to some flows for which either the numerical solution is known or the solution can be obtained using another numerical technique available in other studies. The accuracy of the method is assessed through grid refinement.
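
The solver acceleration can be illustrated with SciPy: the sketch below solves a sparse Poisson-type system, of the kind arising in a SIMPLE pressure-correction step, with ILU-preconditioned BiCGSTAB. The structured-grid assembly is purely for brevity; the paper works on unstructured cells, and all sizes are illustrative.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import bicgstab, spilu, LinearOperator

n = 50                                   # cells per direction
h = 1.0 / n
main = 4.0 * np.ones(n * n)
off = -1.0 * np.ones(n * n - 1)
off[np.arange(1, n * n) % n == 0] = 0.0  # break links across grid rows
A = sp.diags([main, off, off,
              -np.ones(n * n - n), -np.ones(n * n - n)],
             [0, -1, 1, -n, n], format='csr') / h**2
b = np.ones(n * n)                       # mass-imbalance source term

# ILU-preconditioned BiCGSTAB, as used to accelerate the segregated solver
ilu = spilu(A.tocsc())
M = LinearOperator(A.shape, ilu.solve)
p_corr, info = bicgstab(A, b, M=M)
print("converged" if info == 0 else f"info={info}", p_corr[:3])
```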

Keywords: finite volume method, fluid flow, laminar flow, unstructured grid

Procedia PDF Downloads 286
642 Automatic Registration of Rail Profile Based Local Maximum Curvature Entropy

Authors: Hao Wang, Shengchun Wang, Weidong Wang

Abstract:

To counter the influence of train vibration and environmental noise on the measurement of track wear, we propose a method for the automatic extraction of the circular arcs on the inner and outer sides of the rail waist and achieve high-precision registration of the rail profile. Firstly, a polynomial fitting method based on a truncated residual histogram is proposed to find the optimal fitting curve of the profile and reduce the influence of noise on profile curve fitting. Then, based on the curvature distribution characteristics of the fitted curve, an interval search algorithm based on a dynamic window's maximum curvature entropy is proposed to realize the automatic segmentation of the small circular arcs. Finally, we fit the two circle centers as matching reference points based on the small circular arcs on both sides and realize the alignment from the measured profile to the standard designed profile. Static experimental results show that the mean and standard deviation of the method are controlled within 0.01 mm, with small measurement errors and high repeatability. A dynamic test also verified the repeatability of the method in the train-running environment; the dynamic measurement deviation of rail wear is within 0.2 mm with high repeatability.
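
One plausible reading of the dynamic-window maximum-curvature-entropy search is sketched below: fit a polynomial to the profile, compute the curvature along it, and slide a window to find the interval whose curvature distribution has maximum entropy. Function names, the window size, the polynomial degree, the straight-run filter, and the synthetic profile are all illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def curvature(x, coeffs):
    """Curvature of a fitted polynomial y(x): |y''| / (1 + y'^2)^1.5."""
    d1 = np.polyval(np.polyder(coeffs, 1), x)
    d2 = np.polyval(np.polyder(coeffs, 2), x)
    return np.abs(d2) / (1.0 + d1 ** 2) ** 1.5

def max_curvature_entropy_window(x, y, win=50, degree=9):
    """Slide a window along the fitted profile and return the interval whose
    curvature distribution has maximum Shannon entropy (a circular arc has
    near-constant curvature, i.e. a near-uniform distribution)."""
    kappa = curvature(x, np.polyfit(x, y, degree))   # fit, then curvature
    best_i, best_h = 0, -np.inf
    for i in range(len(x) - win):
        seg = kappa[i:i + win]
        if seg.mean() < kappa.mean():                # skip nearly straight runs
            continue
        p = seg / (seg.sum() + 1e-12)                # normalise to a distribution
        h = -np.sum(p * np.log(p + 1e-12))           # curvature entropy
        if h > best_h:
            best_i, best_h = i, h
    return x[best_i], x[best_i + win - 1], best_h

# Synthetic profile: two straight runs joined by a curved transition
x = np.linspace(0.0, 10.0, 500)
y = np.tanh(2.0 * (x - 5.0))
print(max_curvature_entropy_window(x, y))
```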

Keywords: curvature entropy, profile registration, rail wear, structured light, train-running

Procedia PDF Downloads 260
641 ANOVA-Based Feature Selection and Machine Learning System for IoT Anomaly Detection

Authors: Muhammad Ali

Abstract:

Cyber-attacks and anomaly detection on Internet of Things (IoT) infrastructure are emerging concerns in the domain of data-driven intrusion. Rapidly increasing IoT risk is now making headlines around the world. Denial of service, distributed denial of service (DDoS), malicious control, data-type probing, malicious operation, scanning, spying, and wrong setup are attacks and anomalies that can cause an IoT system failure. Everyone talks about cyber security, connectivity, smart devices, and real-time data extraction, yet IoT devices expose a wide variety of new cyber security attack vectors in network traffic. For IoT development, and especially for smart IoT applications, intelligent processing and analysis of data are necessary, so our approach is to secure such systems. We train several machine learning models and compare their ability to accurately predict attacks and anomalies on IoT systems, using ANOVA-based feature selection to obtain leaner prediction models that evaluate network traffic and help protect IoT devices. The machine learning (ML) algorithms used here are KNN, SVM, naive Bayes (NB), decision tree (DT), and random forest (RF), assessed for the most satisfactory test accuracy with fast detection. The evaluated ML metrics include precision, recall, F1 score, FPR, NPV, G-mean, MCC, and AUC-ROC. The random forest algorithm achieved the best results, with an accuracy of 99.98% and less prediction time.
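
A minimal scikit-learn version of the described pipeline, ANOVA F-test feature selection followed by the best-performing random forest, is sketched below on synthetic data standing in for labelled IoT traffic; parameters and dataset are illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split

# Stand-in traffic data; the paper uses labelled IoT network-traffic records
X, y = make_classification(n_samples=4000, n_features=30, n_informative=8,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# ANOVA F-test keeps the features whose class means differ most
sel = SelectKBest(f_classif, k=10).fit(X_tr, y_tr)
X_tr_s, X_te_s = sel.transform(X_tr), sel.transform(X_te)

# Random forest, the best performer reported in the abstract
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr_s, y_tr)
pred = rf.predict(X_te_s)
print(f"accuracy={accuracy_score(y_te, pred):.4f}  f1={f1_score(y_te, pred):.4f}")
```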

Keywords: machine learning, analysis of variance, Internet of Things, network security, intrusion detection

Procedia PDF Downloads 125
640 Identification of Hepatocellular Carcinoma Using Supervised Learning Algorithms

Authors: Sagri Sharma

Abstract:

Analysis of diseases integrating multiple factors increases the complexity of the problem; the development of frameworks for the analysis of diseases is therefore a topic of intense current research. Due to the inter-dependence of the various parameters, the use of traditional methodologies has not been very effective, and newer methodologies are being sought. Supervised learning algorithms are commonly used for performing prediction on previously unseen data. These algorithms are used in fields ranging from image analysis to protein structure and function prediction; they are trained on a known dataset to produce a predictor model that generates reasonable predictions in response to new data. Gene expression profiles generated by DNA analysis experiments can be quite complex, since these experiments can involve hypotheses spanning entire genomes. This work therefore applies a well-known machine learning algorithm, the Support Vector Machine, to analyze the expression levels of thousands of genes simultaneously in a timely, automated and cost-effective way. The objectives of the presented work are the development of a methodology to identify genes relevant to Hepatocellular Carcinoma (HCC) from gene expression datasets utilizing supervised learning algorithms and statistical evaluations, along with the development of a predictive framework that can perform classification tasks on new, unseen data.
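
A minimal sketch of the core classification step, an SVM trained on a high-dimensional expression matrix where genes far outnumber samples, is given below; the synthetic data and the linear-kernel choice are assumptions for illustration, not details reported by the work.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for an HCC expression matrix: few samples, many genes
X, y = make_classification(n_samples=60, n_features=2000, n_informative=40,
                           random_state=0)

# Linear-kernel SVMs are a common default when features (genes) >> samples
clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print("5-fold CV accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```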

Keywords: artificial intelligence, biomarker, gene expression datasets, hepatocellular carcinoma, machine learning, supervised learning algorithms, support vector machine

Procedia PDF Downloads 429
639 Information Management Approach in the Prediction of Acute Appendicitis

Authors: Ahmad Shahin, Walid Moudani, Ali Bekraki

Abstract:

This research presents a predictive data mining model to support accurate diagnosis of acute appendicitis in patients, with the aims of maximizing health service quality, minimizing morbidity/mortality, and reducing cost. Acute appendicitis is a very common disease that requires timely, accurate diagnosis and surgical intervention. Although the treatment of acute appendicitis is simple and straightforward, its diagnosis is still difficult because no single sign, symptom, laboratory test or imaging examination accurately confirms the diagnosis of acute appendicitis in all cases. This contributes to increased morbidity and negative appendectomy. In this study, the authors propose to generate an accurate model for the prediction of patients with acute appendicitis which is based, firstly, on a segmentation technique associated with the ABC algorithm to segment the patients; secondly, on applying fuzzy logic to process the massive volume of heterogeneous and noisy data (age, sex, fever, white blood cell count, neutrophilia, CRP, urine, ultrasound, CT, appendectomy, etc.) in order to express knowledge and analyze the relationships among the data in a comprehensive manner; and thirdly, on applying a dynamic programming technique to reduce the number of data attributes. The proposed model is evaluated against a set of benchmark techniques and on a set of benchmark classification problems (osteoporosis, diabetes and heart disease) obtained from the UCI repository and other data sources.

Keywords: healthcare management, acute appendicitis, data mining, classification, decision tree

Procedia PDF Downloads 351
638 Bayesian Inference for High Dimensional Dynamic Spatio-Temporal Models

Authors: Sofia M. Karadimitriou, Kostas Triantafyllopoulos, Timothy Heaton

Abstract:

Reduced-dimension Dynamic Spatio-Temporal Models (DSTMs) jointly describe the spatial and temporal evolution of a function observed subject to noise. A basic state space model is adopted for the discrete temporal variation, while a continuous autoregressive structure describes the continuous spatial evolution. Application of such a DSTM relies upon the pre-selection of a suitable reduced set of basis functions, and this can present a challenge in practice. In this work, we propose an online estimation method for high dimensional spatio-temporal data based upon a DSTM, and we attempt to resolve this issue by allowing the basis to adapt to the observed data. Specifically, we present a wavelet decomposition in order to obtain a parsimonious approximation of the spatial continuous process. This parsimony can be achieved by placing a Laplace prior distribution on the wavelet coefficients. The aim of using the Laplace prior is to filter out wavelet coefficients with low contribution, and thus achieve the dimension reduction with significant computational savings. We then propose a hierarchical Bayesian state space model, for the estimation of which we offer an appropriate particle filter algorithm. The proposed methodology is illustrated using real environmental data.
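
The role of the Laplace prior can be illustrated with PyWavelets: at the MAP estimate, a Laplace prior on wavelet coefficients corresponds to soft thresholding, which zeroes low-contribution coefficients and yields the parsimonious basis described above. The signal, wavelet family, and threshold below are illustrative assumptions.

```python
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 512)
field = np.sin(6 * np.pi * x) + 0.5 * np.sign(x - 0.5)   # spatial signal
obs = field + 0.2 * rng.standard_normal(512)             # noisy observation

# Wavelet decomposition of the spatial process
coeffs = pywt.wavedec(obs, 'db4', level=5)

# Laplace-prior MAP = soft thresholding: small coefficients are filtered out,
# leaving a reduced basis. The threshold choice here is illustrative.
lam = 0.15
shrunk = [coeffs[0]] + [pywt.threshold(c, lam, mode='soft') for c in coeffs[1:]]
kept = sum(int(np.count_nonzero(c)) for c in shrunk[1:])
total = sum(c.size for c in coeffs[1:])
print(f"kept {kept}/{total} detail coefficients")
recon = pywt.waverec(shrunk, 'db4')       # parsimonious reconstruction
```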

Keywords: multidimensional Laplace prior, particle filtering, spatio-temporal modelling, wavelets

Procedia PDF Downloads 428
637 Optimisation of B2C Supply Chain Resource Allocation

Authors: Firdaous Zair, Zoubir Elfelsoufi, Mohammed Fourka

Abstract:

The allocation of resources is an issue that arises at the tactical and operational levels of strategic planning. This work considers the allocation of resources in the case of pure players, manufacturers and click-and-mortar retailers that have launched online sales. The aim is to improve the level of customer satisfaction while maintaining the benefits of the e-retailer and of its cooperators and reducing costs and risks. Our contribution is a decision support system and tool for improving the allocation of resources in logistics chains in the e-commerce B2C context. We first modeled the B2C chain with all the operations it integrates and the possible scenarios, since online retailers offer a wide selection of personalized services. The personalized services that online shopping companies offer to clients can take many forms, such as customization of payment, distribution methods, and after-sales service choices, and every aspect of customized service has several modes. We then analyzed the optimization problems of supply chain resource allocation in the customized online shopping service mode, which differ from supply chain resource allocation under traditional manufacturing or service circumstances. Finally, we developed an optimization model and algorithm based on this analysis of B2C supply chain resource allocation. It is a multi-objective optimization that considers the collaboration of resources in operations, time and costs, but also the risks and the quality of services, as well as the dynamic and uncertain character of demand.
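
A toy scalarisation of such a multi-objective allocation is sketched below: a weighted sum of cost and delivery-time objectives, minimized subject to demand-coverage and capacity constraints. All numbers and the two-resource/two-mode structure are invented for illustration; the paper's model is considerably richer.

```python
import numpy as np
from scipy.optimize import linprog

# x[i][j] = units of service mode j handled by resource i (invented data)
cost = np.array([[4.0, 6.0], [5.0, 3.0]])     # monetary cost per unit
time = np.array([[2.0, 1.0], [1.0, 3.0]])     # delivery time per unit
w_cost, w_time = 0.6, 0.4                     # decision-maker weights

c = (w_cost * cost + w_time * time).ravel()   # weighted-sum objective
# Each service mode's demand must be fully covered: sum_i x[i,j] = demand[j]
A_eq = np.array([[1, 0, 1, 0], [0, 1, 0, 1]], dtype=float)
demand = np.array([100.0, 80.0])
# Each resource has finite capacity: sum_j x[i,j] <= cap[i]
A_ub = np.array([[1, 1, 0, 0], [0, 0, 1, 1]], dtype=float)
cap = np.array([120.0, 120.0])

res = linprog(c, A_ub=A_ub, b_ub=cap, A_eq=A_eq, b_eq=demand)
print(res.x.reshape(2, 2), res.fun)
```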

Keywords: e-commerce, supply chain, B2C, optimisation, resource allocation

Procedia PDF Downloads 272
636 Multi-Level Air Quality Classification in China Using Information Gain and Support Vector Machine

Authors: Bingchun Liu, Pei-Chann Chang, Natasha Huang, Dun Li

Abstract:

Machine learning and data mining are two important tools for extracting useful information and knowledge from large datasets. In machine learning, classification is a widely used technique to predict qualitative variables and is generally preferred over regression from an operational point of view. Due to the enormous increase in air pollution in various countries, especially China, air quality classification has become one of the most important topics in air quality research and modelling. This study introduces a hybrid classification model based on information theory and the Support Vector Machine (SVM), using the air quality data of four cities in China, namely Beijing, Guangzhou, Shanghai and Tianjin, from Jan 1, 2014 to April 30, 2016. China's Ministry of Environmental Protection classifies daily air quality into six levels, namely Serious Pollution, Severe Pollution, Moderate Pollution, Light Pollution, Good and Excellent, based on the respective Air Quality Index (AQI) values. Using information theory, the information gain (IG) is calculated and feature selection is performed for both categorical features and continuous numeric features. The SVM machine learning algorithm is then applied to the selected features with cross-validation. The final evaluation reveals that the IG and SVM hybrid model performs better than SVM (alone), Artificial Neural Network (ANN) and K-Nearest Neighbours (KNN) models in terms of accuracy as well as complexity.
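
A compact version of the described hybrid can be sketched with scikit-learn, where information gain is computed as the mutual information between each feature and the class label; the synthetic six-class data and all parameters are illustrative, not the study's configuration.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in for daily pollutant/meteorology features with 6 AQI-level labels
X, y = make_classification(n_samples=800, n_features=20, n_informative=10,
                           n_classes=6, n_clusters_per_class=1, random_state=1)

# Information gain ranking (mutual information), then an RBF SVM under CV
model = make_pipeline(StandardScaler(),
                      SelectKBest(mutual_info_classif, k=8),
                      SVC(kernel="rbf", C=10.0, gamma="scale"))
print(cross_val_score(model, X, y, cv=5).mean())
```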

Keywords: machine learning, air quality classification, air quality index, information gain, support vector machine, cross-validation

Procedia PDF Downloads 235
635 Breast Cancer Survivability Prediction via Classifier Ensemble

Authors: Mohamed Al-Badrashiny, Abdelghani Bellaachia

Abstract:

This paper presents a classifier ensemble approach for predicting the survivability of breast cancer patients using the latest database version of the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The system consists of two main components: feature selection and the classifier ensemble. The feature selection component divides the features in the SEER database into four groups and then tries to find the most important features among the four groups that maximize the weighted average F-score of a certain classification algorithm. The ensemble component uses three different classifiers, each of which models a different set of features from SEER through the feature selection module. On top of them, another classifier is used to give the final decision based on the output decisions and confidence scores from each of the underlying classifiers. Different classification algorithms have been examined; the best setup found uses the decision tree, Bayesian network, and Naïve Bayes algorithms for the underlying classifiers and Naïve Bayes for the classifier ensemble step. The system outperforms all systems published to date when evaluated against the exact same data of SEER (period of 1973-2002). It gives an 87.39% weighted average F-score compared to 85.82% and 81.34% for the other published systems. By increasing the data size to cover the whole database (period of 1973-2014), the overall weighted average F-score jumps to 92.4% on the held-out unseen test set.
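
The two-level architecture can be approximated with scikit-learn's stacking, sketched below on synthetic stand-in data; the Bayesian-network base learner is omitted because scikit-learn does not provide one, and the per-classifier feature subsets are skipped for brevity, so this illustrates the pattern rather than the paper's exact system.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Stand-in for SEER survivability records (binary: survived or not)
X, y = make_classification(n_samples=5000, n_features=25, n_informative=10,
                           weights=[0.3, 0.7], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Underlying classifiers, with naive Bayes combining their probabilities
ens = StackingClassifier(
    estimators=[("dt", DecisionTreeClassifier(max_depth=8, random_state=0)),
                ("nb", GaussianNB())],
    final_estimator=GaussianNB(), stack_method="predict_proba", cv=5)
ens.fit(X_tr, y_tr)
print("weighted F1:", f1_score(y_te, ens.predict(X_te), average="weighted"))
```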

Keywords: classifier ensemble, breast cancer survivability, data mining, SEER

Procedia PDF Downloads 329
634 Research on the Optimization of the Facility Layout of Efficient Cafeterias for Troops

Authors: Qing Zhang, Jiachen Nie, Yujia Wen, Guanyuan Kou, Peng Yu, Kun Xia, Qin Yang, Li Ding

Abstract:

BACKGROUND: The facility layout problem (FLP) is an NP-complete problem for which it is hard to obtain an exact optimal solution. FLP has been widely studied in various limited spaces and workflows. For example, cafeterias for troops, with many types of equipment, suffer chaotic processes during dining. OBJECTIVE: This article tries to optimize the layout of a troops' cafeteria and to improve the overall efficiency of the dining process. METHODS: First, the original cafeteria layout design scheme was analyzed from an ergonomic perspective and two new design schemes were generated. Next, three facility layout models were designed, and simulation was applied to compare the total time and density of troops between the schemes. Last, an experiment on the dining process with video observation and analysis verified the simulation results. RESULTS: In simulation, the dining time under the second new layout is shortened by 2.25% and 1.89% (p<0.0001, p=0.0001) compared with the other two layouts, while troop-flow density and interference are both greatly reduced in the two new layouts. In the experiment, process completion time and the number of interferences were reduced as well, which verified the corresponding simulation results. CONCLUSIONS: The two new layout schemes are shown to be superior by a series of simulations and space experiments. In future research, similar approaches could be applied while taking layout-design algorithms into consideration.

Keywords: layout optimization, dining efficiency, troops’ cafeteria, AnyLogic simulation, field experiment

Procedia PDF Downloads 143
633 Highly Robust Crosslinked BIAN-based Binder to Stabilize High-Performance Silicon Anode in Lithium-Ion Secondary Battery

Authors: Agman Gupta, Rajashekar Badam, Noriyoshi Matsumi

Abstract:

Introduction: Recently, silicon has been recognized as one of the potential alternatives to the conventionally used graphite anodes as the anode active material in Li-ion batteries (LIBs). Silicon is abundant in nature, it can alloy with lithium metal, and it has a higher theoretical capacity (~4200 mAh g⁻¹) that is approximately 10 times that of graphite. However, because of a large volume expansion (~400%) upon repeated de-/alloying, the pulverization of Si particles causes the exfoliation of the electrode laminate, leading to the loss of electrical contact and adversely affecting the formation of the solid-electrolyte interface (SEI) [1]. Functional polymers as binders have emerged as a competitive strategy to mitigate these drawbacks and the failure mechanisms of silicon anodes [1]. A variety of aqueous/non-aqueous polymer binders like sodium carboxymethyl cellulose (CMC-Na), styrene butadiene rubber (SBR), poly(acrylic acid), and other variants like mussel-inspired binders have been investigated to overcome these drawbacks [1]. However, only a few reports attempt to address all the drawbacks associated with silicon anodes effectively using a single novel functional polymer system as a binder. In this regard, we report a novel, highly robust, n-type bisiminoacenaphthenequinone (BIAN)-paraphenylene-based crosslinked polymer as a binder for Si anodes in lithium-ion batteries (Fig. 1). On application, the crosslinked BIAN binder was found to provide mechanical robustness against the large volume expansion of Si particles, maintain electrical conductivity within the electrode laminate, and facilitate the formation of a thin SEI by restricting the extent of electrolyte decomposition on the surface of the anode. The fabricated anodic half-cells were evaluated electrochemically for their rate capability, cyclability, and discharge capacity. Experimental: The polymerized BIAN (P-BIAN) copolymer was synthesized as per the procedure reported by our group [2]. Synthesis of crosslinked P-BIAN: a solution of the P-BIAN copolymer (1.497 g, 10 mmol) in N-methylpyrrolidone (NMP) (150 ml) was set up to stir under reflux in a nitrogen atmosphere. To this, 1,6-dibromohexane (5 mmol, 0.77 ml) was added dropwise. The resulting reaction mixture was stirred and refluxed at 150 °C for 24 hours, followed by refrigeration for 3 hours at 5 °C. The product was obtained by evaporating the NMP solvent under reduced pressure and drying under vacuum at 120 °C for 12 hours. The obtained product was a black, sticky compound. It was characterized by ¹H-NMR, XPS, and FT-IR techniques. Results and Discussion: The N 1s XPS spectrum of the crosslinked BIAN polymer showed two characteristic peaks: one corresponding to the sp2-hybridized nitrogen (-C=N-) at 399.6 eV of the diimine backbone in the BIAN-paraphenylene (BP) copolymer, and one at 400.7 eV corresponding to quaternary nitrogen arising from the crosslinking of BP via dibromohexane. A DFT evaluation of the crosslinked BIAN binder showed that it has a low-lying lowest unoccupied molecular orbital (LUMO) that enables it to be doped in the reducing environment and influence the formation of a thin SEI. Therefore, owing to its mechanically robust crosslinked matrices as well as its influence on the formation of a thin SEI, the crosslinked BIAN binder stabilized the Si anode-based half-cell for over 1000 cycles with a reversible capacity of ~2500 mAh g⁻¹ and ~99% capacity retention, as shown in Fig. 2. The dynamic electrochemical impedance spectroscopy (DEIS) characterization of the crosslinked BIAN-based anodic half-cell confirmed that the SEI formed was thin in comparison with that of conventional binder-based anodes. Acknowledgement: We are thankful for the financial support provided by the JST-Mirai Program, Grant Number: JP18077239.

Keywords: self-healing binder, n-type binder, thin solid-electrolyte interphase (SEI), high-capacity silicon anodes, low-LUMO

Procedia PDF Downloads 170
632 Modeling Stream Flow with Prediction Uncertainty by Using SWAT Hydrologic and RBNN Neural Network Models for Agricultural Watershed in India

Authors: Ajai Singh

Abstract:

Simulation of hydrological processes at the watershed outlet through a modelling approach is essential for proper planning and implementation of appropriate soil conservation measures in the Damodar Barakar catchment, Hazaribagh, India, where soil erosion is a dominant problem. This study quantifies the parametric uncertainty involved in the simulation of stream flow using the Soil and Water Assessment Tool (SWAT), a watershed-scale model, and a Radial Basis Neural Network (RBNN), an artificial neural network model. Both models were calibrated and validated against measured stream flow, and the uncertainty in the SWAT model output was quantified using the Sequential Uncertainty Fitting algorithm (SUFI-2). Though both models predicted satisfactorily, the RBNN model performed better than SWAT, with R2 and NSE values of 0.92 and 0.92 during training and 0.71 and 0.70 during the validation period, respectively. Comparison of the results of the two models also indicates a wider prediction interval for the SWAT model. The P-factor values for each model show that the percentage of observed stream flow values bracketed by the 95PPU is higher in the RBNN model (91%) than in SWAT (87%). In other words, the RBNN model estimates the stream flow values more accurately and with less uncertainty. It can be stated that an RBNN model based on simple inputs could be used for the estimation of monthly stream flow and missing data, and for testing the accuracy and performance of other models.
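
The headline metrics are straightforward to compute; the sketch below shows the NSE, R², and P-factor calculations on invented monthly flows, with an illustrative stand-in for the 95PPU band.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def p_factor(obs, lower, upper):
    """Fraction of observations bracketed by the 95PPU band (P-factor)."""
    obs = np.asarray(obs)
    return np.mean((obs >= lower) & (obs <= upper))

# Toy monthly flows; in the paper these come from gauging and the two models
obs = np.array([12.0, 30.0, 55.0, 48.0, 20.0, 9.0])
sim = np.array([10.0, 33.0, 50.0, 45.0, 24.0, 11.0])
band = 0.25 * obs                          # illustrative 95PPU half-width
print("NSE =", round(nse(obs, sim), 3))
print("R^2 =", round(np.corrcoef(obs, sim)[0, 1] ** 2, 3))
print("P-factor =", p_factor(obs, sim - band, sim + band))
```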

Keywords: SWAT, RBNN, SUFI 2, bootstrap technique, stream flow, simulation

Procedia PDF Downloads 370
631 PathoPy2.0: Application of Fractal Geometry for Early Detection and Histopathological Analysis of Lung Cancer

Authors: Rhea Kapoor

Abstract:

Fractal dimension provides a way to characterize non-geometric shapes like those found in nature. The purpose of this research is to estimate Minkowski fractal dimension of human lung images for early detection of lung cancer. Lung cancer is the leading cause of death among all types of cancer and an early histopathological analysis will help reduce deaths primarily due to late diagnosis. A Python application program, PathoPy2.0, was developed for analyzing medical images in pixelated format and estimating Minkowski fractal dimension using a new box-counting algorithm that allows windowing of images for more accurate calculation in the suspected areas of cancerous growth. Benchmark geometric fractals were used to validate the accuracy of the program and changes in fractal dimension of lung images to indicate the presence of issues in the lung. The accuracy of the program for the benchmark examples was between 93-99% of known values of the fractal dimensions. Fractal dimension values were then calculated for lung images, from National Cancer Institute, taken over time to correctly detect the presence of cancerous growth. For example, as the fractal dimension for a given lung increased from 1.19 to 1.27 due to cancerous growth, it represents a significant change in fractal dimension which lies between 1 and 2 for 2-D images. Based on the results obtained on many lung test cases, it was concluded that fractal dimension of human lungs can be used to diagnose lung cancer early. The ideas behind PathoPy2.0 can also be applied to study patterns in the electrical activity of the human brain and DNA matching.
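
A minimal box-counting estimator of the Minkowski dimension, of the kind the program builds on (without its windowing refinement), is sketched below and sanity-checked on a filled square, whose dimension should come out near 2.

```python
import numpy as np

def minkowski_dimension(img, eps_list=(2, 4, 8, 16, 32)):
    """Box-counting estimate of the Minkowski fractal dimension of a binary
    image: slope of log N(eps) versus log(1/eps)."""
    counts = []
    for eps in eps_list:
        h = img.shape[0] // eps * eps
        w = img.shape[1] // eps * eps
        blocks = img[:h, :w].reshape(h // eps, eps, w // eps, eps)
        counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(eps_list)),
                          np.log(counts), 1)
    return slope

# Sanity check on a filled square (dimension ~2), as with benchmark fractals
img = np.zeros((256, 256), dtype=bool)
img[64:192, 64:192] = True
print(minkowski_dimension(img))
```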

Keywords: fractals, histopathological analysis, image processing, lung cancer, Minkowski dimension

Procedia PDF Downloads 178
630 Multimodal Database of Retina Images for Africa: The First Open Access Digital Repository for Retina Images in Sub Saharan Africa

Authors: Simon Arunga, Teddy Kwaga, Rita Kageni, Michael Gichangi, Nyawira Mwangi, Fred Kagwa, Rogers Mwavu, Amos Baryashaba, Luis F. Nakayama, Katharine Morley, Michael Morley, Leo A. Celi, Jessica Haberer, Celestino Obua

Abstract:

Purpose: The main aim of creating the Multimodal Database of Retinal Images for Africa (MoDRIA) was to provide a publicly available repository of retinal images for responsible researchers to conduct algorithm development, in a bid to curb the challenges of ophthalmic artificial intelligence (AI) in Africa. Methods: Data and retina images were ethically sourced from sites in Uganda and Kenya. Data on medical history, visual acuity, ocular examination, blood pressure, and blood sugar were collected. Retina images were captured using fundus cameras (Foru3-nethra and Canon CR-Mark-1). Images were stored on a secure online database. Results: The database consists of 7,859 retinal images in portable network graphics format from 1,988 participants. 18.9% of the images were from patients with human immunodeficiency virus, 18.2% from hypertensive patients, 12.8% from diabetic patients, and the rest from 'normal' participants. Conclusion: Publicly available data repositories are a valuable asset in the development of AI technology. There is therefore a need for the expansion of MoDRIA so as to provide larger datasets that are more representative of Sub-Saharan data.

Keywords: retina images, MoDRIA, image repository, African database

Procedia PDF Downloads 127
629 Research on Detection of Web Page Visual Salience Region Based on Eye Tracker and Spectral Residual Model

Authors: Xiaoying Guo, Xiangyun Wang, Chunhua Jia

Abstract:

The web page has become one of the most important ways of knowing the world, and humans catch a lot of information from it every day. Thus, understanding where humans look when they browse web pages is rather important. In normal scenes, bottom-up features and top-down tasks significantly affect human eye movement. In this paper, we investigated whether a conventional visual salience algorithm can properly predict the visually attractive regions for humans viewing web pages. First, we obtained eye movement data from participants viewing web pages using an eye tracker. Through analysis of the eye movement data, we studied the influence of visual saliency and way of thinking on eye-movement patterns. The analysis showed that the way of thinking affects eye-movement patterns much more than visual saliency. Second, we compared the web page visual salience regions extracted by the Itti model and by the Spectral Residual (SR) model. The results showed that the Spectral Residual (SR) model performs better than the Itti model when compared against the heat map from eye movements. Considering the influence of mental habits on humans' visual regions of interest, we introduced one of the most important cues in mental habit, fixation position, to improve the SR model. The result showed that the improved SR model can better predict the human visual region of interest in web pages.
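
For reference, a minimal NumPy/SciPy version of the Spectral Residual model of Hou and Zhang is sketched below: the saliency map is recovered from the log-amplitude spectrum minus its local average, combined with the original phase. Filter sizes and the test image are illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter

def spectral_residual_saliency(img):
    """Hou & Zhang's spectral residual saliency map for a grayscale image."""
    f = np.fft.fft2(img)
    log_amp = np.log(np.abs(f) + 1e-8)
    phase = np.angle(f)
    # Spectral residual: log amplitude minus its local average
    residual = log_amp - uniform_filter(log_amp, size=3)
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    return gaussian_filter(sal, sigma=3)   # smooth for display/thresholding

# A bright patch on noise should surface as the salient region
rng = np.random.default_rng(0)
img = rng.standard_normal((128, 128)) * 0.1
img[40:60, 70:90] += 1.0
sal = spectral_residual_saliency(img)
print(np.unravel_index(sal.argmax(), sal.shape))  # peak near the patch
```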

Keywords: web page salience region, eye-tracker, spectral residual, visual salience

Procedia PDF Downloads 276
628 Improved Classification Procedure for Imbalanced and Overlapped Situations

Authors: Hankyu Lee, Seoung Bum Kim

Abstract:

The issue of imbalance and overlap in the class distribution is important in various applications of data mining. An imbalanced dataset is a special case of classification problem in which the number of observations of one class (i.e., the majority class) heavily exceeds the number of observations of the other class (i.e., the minority class). An overlapped dataset is a case where many observations are shared between the two classes. Imbalanced and overlapped data can frequently be found in many real examples, including fraud and abuse detection in healthcare, quality prediction in manufacturing, text classification, oil spill detection, remote sensing, and so on. The combined class imbalance and overlap problem is challenging because this situation degrades the performance of most standard classification algorithms. In this study, we propose a classification procedure that can effectively handle imbalanced and overlapped datasets by splitting the data space into three parts, non-overlapping, lightly overlapping, and severely overlapping, and applying a classification algorithm in each part. These three parts are determined based on the Hausdorff distance and the margin of a modified support vector machine. An experimental study was conducted to examine the properties of the proposed method and to compare it with other classification algorithms. The results showed that the proposed method outperformed the competitors under various imbalanced and overlapped situations. Moreover, the applicability of the proposed method was demonstrated through an experiment with real data.
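
The overlap-grading ingredient can be illustrated directly with SciPy's Hausdorff distance, as sketched below on two synthetic overlapping classes; how the paper combines this with the modified SVM margin is not reproduced here.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

rng = np.random.default_rng(0)
major = rng.normal(0.0, 1.0, size=(500, 2))          # majority class
minor = rng.normal(1.0, 1.0, size=(50, 2))           # overlapping minority

# Symmetric Hausdorff distance between the two classes: one ingredient
# used (with the SVM margin) to grade overlap severity
d = max(directed_hausdorff(major, minor)[0],
        directed_hausdorff(minor, major)[0])
print("Hausdorff distance between classes:", round(d, 3))
```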

Keywords: classification, imbalanced data with class overlap, split data space, support vector machine

Procedia PDF Downloads 308
625 Patent on Brain: Brain Waves Stimulation

Authors: Jalil Qoulizadeh, Hasan Sadeghi

Abstract:

Brain waves are electrical wave patterns produced in the human brain. Knowing these waves and activating them can have a positive effect on brain function and ultimately help create an ideal life. The brain is able to produce waves from 0.1 Hz to above 65 Hz, and the Beta One device produces waves that are claimed to match exactly the waves produced by the brain. The function and method of this device are based on magnetic stimulation of the brain. The technology used in the design and production of this device strengthens and improves the frequencies of brain waves with a pre-defined algorithm according to the type of function requested, so that the person can access the expected functions and perform better in life activities. The effect of this field on neurons and their stimulation was evaluated by conducting electroencephalography before and after stimulation and comparing the two baselines by quantitative electroencephalography (qEEG) using a paired t-test in 39 subjects; this confirms a significant effect of the field on the change in electrical activity recorded after 30 minutes of stimulation in all subjects. The Beta One device is able to induce the appropriate pattern of the expected functions softly, safely and effectively (exactly in accordance with the harmony of brain waves), moving brain activity first to a normal state and then to a powerful one. The result is inexpensive neuroscience equipment (compared to existing rTMS equipment) for magnetic brain stimulation in clinics, homes, factories and companies, and professional sports clubs.

Keywords: stimulation, brain, waves, betaOne

Procedia PDF Downloads 81
626 Getting Out of the Box: Tangible Music Production in the Age of Virtual Technological Abundance

Authors: Tim Nikolsky

Abstract:

This paper seeks to explore the different ways in which music producers choose to embrace various levels of technology based on musical values, objectives, affordability, access and workflow benefits. Current digital audio production workflow is questioned. Engineers and music producers of today are increasingly divorced from the tangibility of music production. Making music no longer requires you to reach over and turn a knob. Ideas of authenticity in music production are being redefined. Calculations from the mathematical algorithm with the pretty pictures are increasingly being chosen over hardware containing transformers and tubes. Are mouse clicks and movements equivalent or inferior to the master brush strokes we are seeking to conjure? We make audio production decisions visually, by constantly looking at a screen rather than listening. Have we compromised our musical objectives and values by removing the 'hands-on' nature of music making? DAW interfaces are making our musical decisions for us, not necessarily in our best interests. Technological innovation has presented opportunities as well as challenges for education. What do music production students actually need to learn in a formalised education environment, and to what extent do they need to know it? In this brave new world of omnipresent music creation tools, do we still need tangibility in music production? Interviews with prominent Australian music producers who work in a variety of fields are featured in this paper; they provide insight into answering these questions and move towards developing an understanding of how tangibility can be rediscovered in the next generation of music production.

Keywords: analogue, digital, digital audio workstation, music production, plugins, tangibility, technology, workflow

Procedia PDF Downloads 271
625 Increasing of Gain in Unstable Thin Disk Resonator

Authors: M. Asl. Dehghan, M. H. Daemi, S. Radmard, S. H. Nabavi

Abstract:

Thin disk lasers are engineered for efficient thermal cooling and exhibit superior performance for this task. However, the disk thickness and large pumped area make the use of this gain format in a resonator difficult when constructing a single-mode laser. Choosing an unstable resonator design is beneficial for this purpose. On the other hand, the low-gain medium restricts the application of unstable resonators to low magnifications and therefore to poor beam quality. A promising idea for enabling the application of unstable resonators to wide-aperture, low-gain lasers is to couple a fraction of the out-coupled radiation back into the resonator. The output coupling then depends on the back-reflection ratio and can be adjusted independently from the magnification. The excitation of the converging wave can be achieved by the use of an external reflector. The resonator performance is predicted numerically. First, the threshold conditions of linear, V-shaped and 2V-shaped resonators are investigated. Results show that the maximum magnification is 1.066, which is too low for high-beam-quality purposes. Inserting an additional reflector compensates for the low gain. The reflectivity and the related magnification of a 350-micron Yb:YAG disk are calculated. The theoretical model is based on coupled Kirchhoff integrals and solved numerically by the Fox and Li algorithm. Results show that with the back-reflection mechanism, combined with an increased number of beam incidences on the disk, high gain and high magnification can be achieved.
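
A minimal one-dimensional Fox-Li power iteration, of the kind used to extract the dominant transverse mode and its round-trip loss, is sketched below for a simple two-mirror strip geometry; all parameters are illustrative, and the geometry is far simpler than the paper's disk resonator with back reflection.

```python
import numpy as np

# Fox-Li power iteration: repeatedly propagate the field between two finite
# mirrors with the Fresnel kernel until the dominant transverse mode (and
# its per-transit loss) emerges.
wavelength = 1.03e-6
L = 0.5                                  # mirror spacing (m)
a = 1.0e-3                               # mirror half-width (m)
N = 512
x = np.linspace(-a, a, N)
dx = x[1] - x[0]
k = 2 * np.pi / wavelength

# 1-D Fresnel diffraction kernel from mirror plane x' to mirror plane x
K = (np.sqrt(1.0 / (1j * wavelength * L))
     * np.exp(1j * k * (x[:, None] - x[None, :]) ** 2 / (2 * L)) * dx)

u = np.ones(N, dtype=complex)            # arbitrary starting field
for _ in range(300):                     # one transit per iteration
    u = K @ u
    gamma = np.max(np.abs(u))            # converges to |mode eigenvalue|
    u = u / gamma

print("diffraction power loss per transit: %.3f" % (1 - gamma ** 2))
```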

Keywords: unstable resonators, thin disk lasers, gain, external reflector

Procedia PDF Downloads 413
624 Noise Reduction in Web Data: A Learning Approach Based on Dynamic User Interests

Authors: Julius Onyancha, Valentina Plekhanova

Abstract:

One of the significant issues facing web users is the amount of noise in web data, which hinders the process of finding useful information related to their dynamic interests. Current research works consider noise to be any data that does not form part of the main web page and propose noise web data reduction tools which mainly focus on eliminating noise with respect to the content and layout of web data. This paper argues that not all data that form part of the main web page is of interest to a user, and that not all noise data is actually noise to a given user. Therefore, learning of the noise web data allocated to user requests ensures not only a reduction of the noisiness level in a web user profile, but also a decrease in the loss of useful information, and hence improves the quality of the web user profile. A Noise Web Data Learning (NWDL) tool/algorithm capable of learning noise web data in a web user profile is proposed. The proposed work considers the elimination of noise data in relation to dynamic user interest. In order to validate the performance of the proposed work, an experimental design setup is presented. The results obtained are compared with those of current algorithms applied in the noise web data reduction process. The experimental results show that the proposed work considers the dynamic change of user interest prior to the elimination of noise data. The proposed work contributes towards improving the quality of a web user profile by reducing the amount of useful information eliminated as noise.

Keywords: web log data, web user profile, user interest, noise web data learning, machine learning

Procedia PDF Downloads 265
623 Coding and Decoding versus Space Diversity for Rayleigh Fading Radio Frequency Channels

Authors: Ahmed Mahmoud Ahmed Abouelmagd

Abstract:

Diversity is the usual remedy for transmitted-signal level variations (fading phenomena) in radio frequency channels. Diversity techniques utilize two or more copies of a signal and combine those signals to combat fading. The basic concept of diversity is to transmit the signal via several independent diversity branches to obtain independent signal replicas in the time, frequency, space, or polarization diversity domains. Coding and decoding processes can be an alternative remedy for fading phenomena; they cannot increase the channel capacity, but they can improve the error performance. In this paper we propose the use of replication decoding with the BCH code class, and the Viterbi decoding algorithm with convolutional coding, as examples of coding and decoding processes. The results are compared to those obtained from two optimized selection space diversity techniques. The performance over a Rayleigh fading channel, the model considered for radio frequency channels, is evaluated for each case. The evaluation results show that the coding and decoding approaches, especially the BCH coding approach with the replication decoding scheme, give better performance than the selection space diversity optimization approaches. An approach combining the coding and decoding diversity as well as the space diversity is also considered; its main disadvantage is its complexity, but it yields good performance results.
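
As a baseline for the comparison, the sketch below Monte Carlo-simulates BPSK over flat Rayleigh fading with and without two-branch selection diversity; modulation, SNR, and sample sizes are illustrative assumptions, and the coding/decoding side is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)
n, snr_db = 200_000, 10.0
es = 10 ** (snr_db / 10)                               # symbol energy (unit noise)

bits = rng.integers(0, 2, n)
s = (2 * bits - 1) * np.sqrt(es)                       # BPSK symbols

def cgauss(n):
    """Unit-power circular complex Gaussian samples."""
    return (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

h1, h2 = cgauss(n), cgauss(n)                          # Rayleigh branch gains
r1, r2 = h1 * s + cgauss(n), h2 * s + cgauss(n)

# Single branch: coherent (matched-filter) detection on branch 1 only
ber1 = np.mean(((np.conj(h1) * r1).real > 0).astype(int) != bits)
# Selection diversity: per symbol, detect on the branch with the larger |h|
pick2 = np.abs(h2) > np.abs(h1)
r = np.where(pick2, np.conj(h2) * r2, np.conj(h1) * r1)
ber2 = np.mean((r.real > 0).astype(int) != bits)
print(f"BER single branch: {ber1:.4f}   selection diversity: {ber2:.4f}")
```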

Keywords: Rayleigh fading, diversity, BCH codes, replication decoding, convolutional coding, Viterbi decoding, space diversity

Procedia PDF Downloads 443
622 Development of Precise Ephemeris Generation Module for Thaichote Satellite Operations

Authors: Manop Aorpimai, Ponthep Navakitkanok

Abstract:

In this paper, the development of the ephemeris generation module used for the Thaichote satellite operations is presented. It is a vital part of the flight dynamics system, which comprises the orbit determination, orbit propagation, event prediction and station-keeping maneuver modules. In the generation of the spacecraft ephemeris data, the estimated orbital state vector from the orbit determination module is used as an initial condition. The equations of motion are then integrated forward in time to predict the satellite states. The higher geopotential harmonics, as well as other disturbing forces, are taken into account to resemble the environment in low-earth orbit. Using a highly accurate numerical integrator based on the Bulirsch-Stoer algorithm, the ephemeris data can be generated for long-term predictions with a relatively small computational burden and short calculation time. Events occurring during the prediction that are related to mission operations, such as the satellite's rise/set as viewed from the ground station, Earth and Moon eclipses, the drift in ground track and the drift in the local solar time of the orbital plane, are all detected and reported. When combined with other modules to form a flight dynamics system, this application is intended to serve the Thaichote satellite and successive Thailand Earth-observation missions.
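
A toy version of the propagation step, two-body motion plus the dominant J2 geopotential harmonic, is sketched below. SciPy offers no Bulirsch-Stoer integrator, so the high-order adaptive DOP853 method stands in for it; the constants are standard Earth values, and the initial state is illustrative, not Thaichote's.

```python
import numpy as np
from scipy.integrate import solve_ivp

MU = 3.986004418e14            # Earth gravitational parameter (m^3/s^2)
J2, RE = 1.08263e-3, 6378137.0 # J2 coefficient, Earth equatorial radius (m)

def accel(t, y):
    """Two-body acceleration plus the dominant J2 zonal harmonic."""
    r = y[:3]
    rn = np.linalg.norm(r)
    a = -MU * r / rn**3                        # Keplerian term
    z2 = (r[2] / rn) ** 2
    f = 1.5 * J2 * MU * RE**2 / rn**5          # J2 perturbation scale
    a = a + f * np.array([r[0] * (5 * z2 - 1),
                          r[1] * (5 * z2 - 1),
                          r[2] * (5 * z2 - 3)])
    return np.concatenate([y[3:], a])

# Illustrative near-polar LEO state: position (m) and velocity (m/s)
y0 = np.array([7000e3, 0.0, 0.0, 0.0, -1128.0, 7460.0])
sol = solve_ivp(accel, (0.0, 5400.0), y0, method="DOP853",
                rtol=1e-10, atol=1e-6)
print("position after 90 min (km):", np.round(sol.y[:3, -1] / 1e3, 3))
```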

Keywords: flight dynamics system, orbit propagation, satellite ephemeris, Thailand’s Earth Observation Satellite

Procedia PDF Downloads 377
621 Introducing a New Model of Anomaly Detection in Computer Networks Using Artificial Immune Systems

Authors: Mehrshad Khosraviani, Faramarz Abbaspour Leyl Abadi

Abstract:

Computer networks are a fundamental component of the modern information society, and these networks are generally connected to the Internet. Because the Internet was not originally designed with security as its primary purpose, attacks on such networks have become a serious concern in recent decades. Today, different security tools and systems, including intrusion detection systems, are used to provide security in networks. In this work, an anomaly detection system based on artificial immunity is designed and evaluated. The idea of using artificial immune methods for detecting abnormalities in computer networks is motivated by their useful properties, such as specificity, the ability to detect a variety of attacks and previously unseen anomalies, memory, learning ability, and self-regulation. The detection system presented in this paper requires only normal samples for training; no additional data about the types of attacks is needed. In the proposed system, positive selection and negative selection processes are used on the selected samples to create a distinction between the normal colony and attacks. Evaluation on real data collections indicates that the false alarm rate of the proposed system is often low compared to other methods, while its detection rate remains competitive.
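
A minimal real-valued negative selection sketch, the mechanism named above, is given below: candidate detectors are kept only if they fail to match any normal training sample, and test points matching a detector are flagged anomalous. Features, radii, and counts are illustrative, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
self_radius, n_detectors = 0.1, 400

normal = rng.uniform(0.3, 0.7, size=(300, 2))        # normal traffic features

# Negative selection: reject any candidate detector matching "self"
detectors = []
while len(detectors) < n_detectors:
    d = rng.uniform(0.0, 1.0, size=2)
    if np.min(np.linalg.norm(normal - d, axis=1)) > self_radius:
        detectors.append(d)                          # survives selection
detectors = np.array(detectors)

def is_anomalous(x, radius=self_radius):
    """Flag activity that falls within the radius of any detector."""
    return np.min(np.linalg.norm(detectors - x, axis=1)) < radius

print(is_anomalous(np.array([0.5, 0.5])))   # inside self region -> False
print(is_anomalous(np.array([0.05, 0.9])))  # far from normal data -> True
```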

Keywords: artificial immune system, abnormality detection, intrusion detection, computer networks

Procedia PDF Downloads 353
620 Comparison of Two Maintenance Policies for a Two-Unit Series System Considering General Repair

Authors: Seyedvahid Najafi, Viliam Makis

Abstract:

In recent years, maintenance optimization has attracted special attention due to the growth of industrial systems complexity. Maintenance costs are high for many systems, and preventive maintenance is effective when it increases operations' reliability and safety at a reduced cost. The novelty of this research is to consider general repair in the modeling of multi-unit series systems and solve the maintenance problem for such systems using the semi-Markov decision process (SMDP) framework. We propose an opportunistic maintenance policy for a series system composed of two main units. Unit 1, which is more expensive than unit 2, is subjected to condition monitoring, and its deterioration is modeled using a gamma process. Unit 1 hazard rate is estimated by the proportional hazards model (PHM), and two hazard rate control limits are considered as the thresholds of maintenance interventions for unit 1. Maintenance is performed on unit 2, considering an age control limit. The objective is to find the optimal control limits and minimize the long-run expected average cost per unit time. The proposed algorithm is applied to a numerical example to compare the effectiveness of the proposed policy (policy Ⅰ) with policy Ⅱ, which is similar to policy Ⅰ, but instead of general repair, replacement is performed. Results show that policy Ⅰ leads to lower average cost compared with policy Ⅱ. 

Keywords: condition-based maintenance, proportional hazards model, semi-Markov decision process, two-unit series systems

Procedia PDF Downloads 123
619 The Use of Remotely Sensed Data to Extract Wetlands Area in the Cultural Park of Ahaggar, South of Algeria

Authors: Y. Fekir, K. Mederbal, M. A. Hammadouche, D. Anteur

Abstract:

The cultural park of the Ahaggar, occupying a large area of Algeria, is characterized by rich wetland areas that must be preserved and managed, both in time and in space. The management of a large area, by its complexity, needs large amounts of data which, for the most part, are spatially localized (DEM, satellite images, socio-economic information, etc.), and for which the use of conventional and traditional methods is quite difficult. Remote sensing, with its efficiency in environmental applications, has become an indispensable solution for this kind of study, and remotely sensed imaging data have proven very useful over the last decade. They can aid in several domains, such as the detection and identification of diverse wetland surface targets, topographical details, and geological features. In this work, we try to extract wetland areas automatically using multispectral remotely sensed data from the Earth Observing 1 (EO-1) and Landsat satellites. Both carry high-resolution multispectral imagers with a 30 m resolution. We have used images acquired over several areas of interest in the cultural park of the Ahaggar in the south of Algeria. An extraction algorithm is applied to several spectral indices, obtained from combinations of different spectral bands, to extract the wetland fraction of land use. The obtained results show accurate discrimination of wetland areas from the other land use themes through careful exploitation of the spectral indices.
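
One plausible example of such an index-based extraction is McFeeters' NDWI, sketched below on invented reflectance bands; the paper's actual combination of indices and thresholds is not specified here.

```python
import numpy as np

def ndwi(green, nir):
    """McFeeters NDWI = (Green - NIR) / (Green + NIR); open water and wet
    surfaces tend toward positive values."""
    green = green.astype(float)
    nir = nir.astype(float)
    return (green - nir) / (green + nir + 1e-9)

# Toy reflectance bands standing in for Landsat/EO-1 30 m pixels
rng = np.random.default_rng(0)
green = rng.uniform(0.05, 0.3, size=(100, 100))
nir = rng.uniform(0.1, 0.5, size=(100, 100))
nir[40:60, 40:60] = 0.02                  # water absorbs strongly in NIR

wetland_mask = ndwi(green, nir) > 0.0     # simple threshold classification
print("wetland fraction:", wetland_mask.mean())
```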

Keywords: multispectral data, EO1, landsat, wetlands, Ahaggar, Algeria

Procedia PDF Downloads 377
618 A Pipeline for Detecting Copy Number Variation from Whole Exome Sequencing Using Comprehensive Tools

Authors: Cheng-Yang Lee, Petrus Tang, Tzu-Hao Chang

Abstract:

Copy number variations (CNVs) play an important role in many human diseases, such as autism, schizophrenia and a number of cancers. Many disease-related variants are found in genome coding regions, and whole exome sequencing (WES) is a cost-effective and powerful technology for detecting variants that are enriched in exons, with potential applications in clinical settings. Although several algorithms have been developed to detect CNVs using WES, and have been compared with other algorithms on their authors' own samples to find the most suitable methods, there has been no consistent dataset across algorithms for evaluating their CNV detection ability. On the other hand, most algorithms use a command-line interface, which may greatly limit the analysis capability of many laboratories. We created a series of simulated WES datasets from UCSC hg19 chromosome 22 and then evaluated the CNV detection ability of 19 algorithms from the OMICtools database using our simulated WES datasets. We computed the sensitivity, specificity and accuracy of each algorithm for validation of the exome-derived CNVs. After comparing the 19 algorithms from the OMICtools database, we constructed a platform that installs all of the algorithms in a virtual machine, such as VirtualBox, which can be set up conveniently on local computers, and we created a simple script that is easy to use for detecting CNVs with the algorithms selected by the user. We also built a table detailing many kinds of properties, such as input requirements and CNV detection ability, for all of the algorithms, which provides users with a specification for choosing the optimal algorithms.
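
Benchmarking detected calls against simulated truth reduces to interval bookkeeping; the sketch below computes per-event sensitivity and precision under a reciprocal-overlap-style matching rule, with the 50% threshold and all intervals chosen for illustration.

```python
def interval_overlap(a, b):
    """Overlapping bases between two (start, end) intervals."""
    return max(0, min(a[1], b[1]) - max(a[0], b[0]))

def cnv_call_metrics(truth, calls, min_frac=0.5):
    """Per-event sensitivity/precision: a simulated CNV counts as detected
    if some call covers at least min_frac of it, and a call counts as a
    false positive if no truth event covers min_frac of the call."""
    tp = sum(any(interval_overlap(t, c) >= min_frac * (t[1] - t[0])
                 for c in calls) for t in truth)
    fp = sum(not any(interval_overlap(c, t) >= min_frac * (c[1] - c[0])
                     for t in truth) for c in calls)
    sens = tp / len(truth) if truth else 0.0
    prec = (len(calls) - fp) / len(calls) if calls else 0.0
    return sens, prec

# Simulated chr22 CNVs (start, end) versus one tool's calls
truth = [(100_000, 150_000), (300_000, 320_000), (500_000, 560_000)]
calls = [(98_000, 149_000), (505_000, 530_000), (900_000, 910_000)]
print(cnv_call_metrics(truth, calls))   # (sensitivity, precision)
```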

Keywords: whole exome sequencing, copy number variations, omictools, pipeline

Procedia PDF Downloads 319
617 Classification of Business Models of Italian Bancassurance by Balance Sheet Indicators

Authors: Andrea Bellucci, Martina Tofi

Abstract:

The aim of this paper is to analyze the business models of bancassurance in Italy for the life business. The life insurance business is very developed in the Italian market, and bank branches hold 80% of the market share. Given its maturity, the life insurance market needs to consolidate its organizational form to allow for the development of the non-life business, which nowadays collects few premiums but represents a great opportunity to enlarge the market share of bancassurance by using its strength in the distribution channel, while the market share of independent agents is decreasing. Starting with the main business model of bancassurance for the life business, this paper analyzes the performance of life companies in the Italian market through balance sheet indicators and the main discriminant variables of business models. The study observes trends from 2013 to 2015 for the Italian market by exploiting a database managed by the Associazione Nazionale delle Imprese di Assicurazione (ANIA). The applied approach is based on a bottom-up analysis, starting with variables and indicators to define the classification of business models. The statistical classification algorithm proposed by Ward is employed to design the business models' profiles. The results of the analysis are a representation of the main business models, each built from its profile of indicators. In this way, an unsupervised analysis is developed; its limitation is the judgmental dimension based on the researchers' opinion, but it makes it possible to obtain a design of effective business models.
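
Ward's algorithm is available directly in SciPy; the sketch below clusters invented balance-sheet indicators into three business-model profiles, with standardization so that no indicator dominates the variance criterion.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, ward
from scipy.stats import zscore

# Toy balance-sheet indicators per life company (rows), e.g. premium growth,
# expense ratio, ROE. Values are invented; the paper uses ANIA data.
X = np.array([[0.12, 0.30, 0.08],
              [0.10, 0.28, 0.07],
              [0.02, 0.45, 0.03],
              [0.01, 0.50, 0.02],
              [0.25, 0.20, 0.12]])

# Ward's minimum-variance hierarchical clustering on standardized indicators
Z = ward(zscore(X, axis=0))
labels = fcluster(Z, t=3, criterion="maxclust")
print("business-model cluster per company:", labels)
```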

Keywords: bancassurance, business model, non-life bancassurance, insurance business value drivers

Procedia PDF Downloads 299
616 Full-Field Estimation of Cyclic Threshold Shear Strain

Authors: E. E. S. Uy, T. Noda, K. Nakai, J. R. Dungca

Abstract:

The cyclic threshold shear strain is the cyclic shear strain amplitude that serves as the indicator of the development of pore water pressure. The parameter can be obtained by performing a cyclic triaxial test, a shaking table test, cyclic simple shear, or a resonant column test. In a cyclic triaxial test, other researchers install measuring devices in close proximity to the soil to measure the parameter. In this study, an attempt was made to estimate the cyclic threshold shear strain parameter using a full-field measurement technique. The technique uses a camera to monitor and measure the movement of the soil. For this study, the technique was incorporated in a strain-controlled consolidated undrained cyclic triaxial test. Calibration of the camera was first performed to ensure that the camera could properly measure the deformation under cyclic loading. Its capacity to measure deformation was also investigated using a cylindrical rubber dummy. Two-dimensional image processing was implemented, and the Lucas-Kanade optical flow algorithm was applied to track the movement of the soil particles. Results from the full-field measurement technique were compared with the results from a linear variable displacement transducer. A range of values was determined from the estimation, owing to the non-homogeneous deformation of the soil observed during the cyclic loading. The minimum values were on the order of 10⁻²% in some areas of the specimen.
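
The tracking step can be reproduced with OpenCV's pyramidal Lucas-Kanade implementation, as sketched below on a synthetic frame pair standing in for the triaxial-test video; feature and window parameters are illustrative.

```python
import cv2
import numpy as np

# Synthetic frame pair: a bright "specimen" shifted by a small deformation
frame0 = np.zeros((240, 320), dtype=np.uint8)
cv2.rectangle(frame0, (100, 80), (180, 160), 255, -1)
frame1 = np.roll(frame0, shift=(2, 1), axis=(0, 1))

# Detect corner features on the first frame, then track them with LK flow
p0 = cv2.goodFeaturesToTrack(frame0, maxCorners=50,
                             qualityLevel=0.01, minDistance=5)
p1, status, _ = cv2.calcOpticalFlowPyrLK(
    frame0, frame1, p0, None,
    winSize=(21, 21), maxLevel=3,
    criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))

disp = (p1 - p0)[status.ravel() == 1]     # displacements of tracked points
print("mean displacement (px):", disp.reshape(-1, 2).mean(axis=0))
```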

Keywords: cyclic loading, cyclic threshold shear strain, full-field measurement, optical flow

Procedia PDF Downloads 235