Search results for: bagging ensemble methods
14313 Morphological and Molecular Studies (ITS1) of Hydatid Cysts in Slaughtered Sheep in Mashhad Area
Authors: G. R. Hashemi Tabar, G. R. Razmi, F. Mirshekar
Abstract:
Echinococcus granulosus has ten strains, G1 to G10, each associated with a particular intermediate host. Morphology, epidemiology, treatment and control differ between strains. Many morphological and molecular methods exist to differentiate Echinococcus strains; however, using both together provides better information for identifying each strain. The aim of this study was to identify the Echinococcus granulosus strain of hydatid cysts in slaughtered sheep in the Mashhad area using morphological and molecular methods. Livers and lungs infected with hydatid cysts were collected and transferred to the laboratory. Hydatid cyst fluid was extracted, and the morphological characters of the rostellar hooks of protoscolices were measured with an ocular micrometer. The total length of large hooks, blade length of large hooks, total length of small hooks, blade length of small hooks, and number of hooks per protoscolex were 23±0.3 μm, 11.7±0.5 μm, 19.3±1.1 μm, 8±1.1 μm, and 33.7±0.7, respectively. In the molecular section of the study, DNA from each sample was extracted with the MBST kit, and PCR was performed with specific primers (EgF, EgR) that amplify a fragment of the ITS1 gene. The PCR product was digested with the Bsh1236I enzyme. Based on the PCR-RFLP pattern (four bands), the G1, G2 and G3 strains of Echinococcus granulosus were candidates. The three strains were differentiated by sequencing analysis, and the G1 strain was diagnosed. The agreement between the molecular results and the morphometric characters of the rostellar hooks confirmed the presence of the G1 strain of Echinococcus in the slaughtered sheep of the Mashhad area.
Keywords: Echinococcus granulosus, Hydatid cyst, PCR, sheep
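The PCR-RFLP step described in this abstract lends itself to a small in-silico illustration. The sketch below computes restriction fragment lengths for a made-up ITS1-like sequence; the Bsh1236I recognition site CGCG is the enzyme's documented site (it is an isoschizomer of BstUI), but the sequence, the cut-position convention, and the resulting fragment sizes are illustrative assumptions, not the authors' data.

```python
# Hedged sketch: an in-silico restriction digest, NOT the authors' pipeline.
# Bsh1236I (an isoschizomer of BstUI) recognizes CGCG; the sequence below is
# an invented ITS1-like fragment for illustration only.

def digest(seq: str, site: str = "CGCG") -> list[int]:
    """Return fragment lengths after cutting seq at every occurrence of site.

    For simplicity the cut is placed at the start of each recognition site;
    real enzymes cut at a defined position within or near the site.
    """
    cuts = []
    pos = seq.find(site)
    while pos != -1:
        cuts.append(pos)
        pos = seq.find(site, pos + 1)
    bounds = [0] + cuts + [len(seq)]
    return [b - a for a, b in zip(bounds, bounds[1:]) if b > a]

toy_its1 = "ATGC" * 10 + "CGCG" + "TTAA" * 5 + "CGCG" + "GGCC" * 8
print(digest(toy_its1))  # each fragment would appear as one band on a gel
```

The number of fragments returned corresponds to the number of bands one would expect on the electrophoresis gel of the real digest.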
Procedia PDF Downloads 518
14312 Creative Application of Cognitive Linguistics and Communicative Methods to Eliminate Common Learners' Mistakes in Academic Essay Writing
Authors: Ekaterina Lukianchenko
Abstract:
This article sums up a six-year experience of teaching English as a foreign language to over 900 university students at MGIMO (Moscow University of International Relations, Russia), all of them native speakers of Russian aged 16 to 23. By combining a modern communicative approach to teaching with cognitive linguistics theories, one can deal more effectively with deeply rooted mistakes that particular students have and that conventional methods have failed to eliminate. If language items are understood as concepts and frames, and classroom activities as meaningful parts of language competence development, this might help solve such problems as incorrect use of words, unsuitable register, and confused tenses, as well as logical or structural mistakes, and even certain psychological issues concerning essay writing. Along with classic teaching methods, such classroom practice includes plenty of interaction between students: playing special classroom games aimed at eliminating particular mistakes, working in pairs and groups, and integrating all skills in one class. The main conclusion the author draws is that academic essay writing classes demand a balanced plan: one that not only includes writing as such, but additionally features listening, reading, and speaking activities specifically chosen according to the skills and language students will need to write the particular type of essay.
Keywords: academic essay writing, creative teaching, cognitive linguistics, competency-based approach, communicative language teaching, frame, concept
Procedia PDF Downloads 297
14311 Evaluating Multiple Diagnostic Tests: An Application to Cervical Intraepithelial Neoplasia
Authors: Areti Angeliki Veroniki, Sofia Tsokani, Evangelos Paraskevaidis, Dimitris Mavridis
Abstract:
The plethora of diagnostic test accuracy (DTA) studies has led to the increased use of systematic reviews and meta-analyses of DTA studies. Clinicians and healthcare professionals often consult DTA meta-analyses to make informed decisions regarding the optimum test to choose and use for a given setting. For example, the human papilloma virus (HPV) DNA, mRNA, and cytology tests can be used for the diagnosis of cervical intraepithelial neoplasia grade 2+ (CIN2+). But which test is the most accurate? Studies directly comparing test accuracy are not always available, and comparisons between multiple tests create a network of DTA studies that can be synthesized through a network meta-analysis of diagnostic tests (DTA-NMA). The aim is to summarize the DTA-NMA methods for at least three index tests presented in the methodological literature. We illustrate the application of the methods using a real data set for the comparative accuracy of HPV DNA, HPV mRNA, and cytology tests for cervical cancer. A search was conducted in PubMed, Web of Science, and Scopus from inception until the end of July 2019 to identify full-text research articles that describe a DTA-NMA method for three or more index tests. Since the joint classification of the results of one index test against those of another, amongst those with the target condition and amongst those without it, is rarely reported in DTA studies, only methods requiring the 2x2 tables of the results of each index test against the reference standard were included. Studies of any design published in English were eligible for inclusion. Relevant unpublished material was also included. Ten relevant studies were finally included to evaluate their methodology. DTA-NMA methods that have been presented in the literature, together with their advantages and disadvantages, are described.
In addition, using 37 studies for cervical cancer obtained from a published Cochrane review as a case study, an application of the identified DTA-NMA methods to determine the most promising test (in terms of sensitivity and specificity) for use as the best screening test to detect CIN2+ is presented. In conclusion, different approaches to the comparative DTA meta-analysis of multiple tests may lead to different results and hence may influence decision-making. Acknowledgment: This research is co-financed by Greece and the European Union (European Social Fund, ESF) through the Operational Programme "Human Resources Development, Education and Lifelong Learning 2014-2020" in the context of the project "Extension of Network Meta-Analysis for the Comparison of Diagnostic Tests" (MIS 5047640).
Keywords: colposcopy, diagnostic test, HPV, network meta-analysis
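The 2x2-table inputs that the abstract says DTA-NMA methods require can each be summarized as a sensitivity/specificity pair per index test. A minimal sketch with invented counts (not the Cochrane review data):

```python
# Hedged sketch of the per-test summaries that feed a DTA meta-analysis:
# sensitivity and specificity from each index test's 2x2 table against the
# reference standard. The counts below are invented for illustration.

def sens_spec(tp: int, fp: int, fn: int, tn: int) -> tuple[float, float]:
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical 2x2 tables (TP, FP, FN, TN) for three index tests for CIN2+.
tables = {"HPV DNA": (90, 30, 10, 70),
          "HPV mRNA": (80, 15, 20, 85),
          "cytology": (70, 10, 30, 90)}
for test, t in tables.items():
    se, sp = sens_spec(*t)
    print(f"{test}: sensitivity={se:.2f}, specificity={sp:.2f}")
```

A DTA-NMA would then pool such pairs across studies while respecting the network structure; that pooling step is far beyond this sketch.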
Procedia PDF Downloads 139
14310 Real-Time Nonintrusive Heart Rate Measurement: Comparative Case Study of LED Sensorics' Accuracy and Benefits in Heart Monitoring
Authors: Goran Begović
Abstract:
In recent years, many researchers have focused on non-intrusive measuring methods for human biosignals. These methods provide solutions for everyday use, whether for health monitoring or finessing a workout routine. One of the biggest issues with these solutions is that sensor accuracy is highly variable due to many factors, such as ambient light and skin color diversity. That is why we wanted to explore different outcomes under such circumstances in order to find the optimal algorithm(s) for extracting heart rate (HR) information. The optimization of such algorithms can enable wider, cheaper, and safer home health monitoring, without having to visit medical professionals as often to observe heart irregularities. In this study, we explored the accuracy of infrared (IR), red, and green LED sensorics in a controlled environment and compared the results with a medically accurate ECG monitoring device.
Keywords: data science, ECG, heart rate, holter monitor, LED sensors
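A toy version of the HR-extraction problem this abstract studies: peak counting on a synthetic photoplethysmography-like signal. The sampling rate, simulated heart rate, and amplitude threshold are all assumptions; real LED sensor data would first need filtering for the ambient-light and skin-tone effects the abstract mentions.

```python
# Hedged sketch: heart-rate extraction by peak counting on a clean synthetic
# signal; not the comparative algorithm study described in the abstract.
import math

FS = 50            # sampling rate in Hz (assumed)
BPM = 72           # simulated heart rate
duration = 10      # seconds of signal
signal = [math.sin(2 * math.pi * (BPM / 60) * n / FS) for n in range(FS * duration)]

# Count local maxima above a threshold as heartbeats.
peaks = [i for i in range(1, len(signal) - 1)
         if signal[i] > signal[i - 1] and signal[i] >= signal[i + 1]
         and signal[i] > 0.5]
bpm_est = len(peaks) * 60 / duration
print(bpm_est)
```

On this noiseless signal the estimate recovers the simulated rate exactly; the interesting engineering in the abstract is precisely what happens when the signal is not this clean.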
Procedia PDF Downloads 127
14309 A Review of Effective Gene Selection Methods for Cancer Classification Using Microarray Gene Expression Profile
Authors: Hala Alshamlan, Ghada Badr, Yousef Alohali
Abstract:
Cancer is a dreadful disease that causes a considerable death rate in humans. DNA microarray-based gene expression profiling has emerged as an efficient technique for cancer classification, as well as for diagnosis, prognosis, and treatment purposes. In recent years, the DNA microarray technique has gained attention in both scientific and industrial fields. It is important to determine the informative genes that cause cancer in order to improve early diagnosis and give effective chemotherapy treatment. To gain deep insight into the cancer classification problem, it is necessary to take a closer look at the proposed gene selection methods, which we believe should be an integral preprocessing step for cancer classification. Furthermore, finding an accurate gene selection method is a significant issue in cancer classification because it reduces the dimensionality of the microarray dataset and selects informative genes. In this paper, we classify and review the state-of-the-art gene selection methods. We proceed by evaluating the performance of each gene selection approach based on classification accuracy and the number of informative genes. In our evaluation, we use four benchmark microarray datasets for cancer diagnosis (leukemia, colon, lung, and prostate). In addition, we compare the performance of the gene selection methods to identify those able to select a small set of marker genes while ensuring high cancer classification accuracy. To the best of our knowledge, this is the first attempt to compare gene selection approaches for cancer classification using microarray gene expression profiles.
Keywords: gene selection, feature selection, cancer classification, microarray, gene expression profile
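As a concrete illustration of the filter-type gene selection methods such a review covers, the sketch below ranks genes by a Welch-style two-class score on an invented expression matrix and keeps the top k. It stands for no specific method from the review.

```python
# Hedged sketch of a simple filter-type gene selection step: rank genes by a
# two-class t-like score and keep the top k. Toy data, not microarray data.
import math

def t_score(values_a, values_b):
    """Welch-style score: |mean difference| / pooled standard error."""
    ma = sum(values_a) / len(values_a)
    mb = sum(values_b) / len(values_b)
    va = sum((x - ma) ** 2 for x in values_a) / (len(values_a) - 1)
    vb = sum((x - mb) ** 2 for x in values_b) / (len(values_b) - 1)
    return abs(ma - mb) / math.sqrt(va / len(values_a) + vb / len(values_b))

def select_genes(expr, labels, k):
    """expr: {gene: [sample values]}; labels: class (0/1) of each sample."""
    scores = {}
    for gene, vals in expr.items():
        a = [v for v, c in zip(vals, labels) if c == 0]
        b = [v for v, c in zip(vals, labels) if c == 1]
        scores[gene] = t_score(a, b)
    return sorted(scores, key=scores.get, reverse=True)[:k]

expr = {"g1": [1, 1, 2, 9, 10, 9],   # strongly differential
        "g2": [5, 6, 5, 6, 5, 6],    # uninformative
        "g3": [2, 3, 2, 5, 6, 5]}    # moderately differential
print(select_genes(expr, [0, 0, 0, 1, 1, 1], 2))
```

Real microarray pipelines add multiple-testing control and wrapper or embedded selection stages on top of this kind of ranking.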
Procedia PDF Downloads 454
14308 A Descriptive Study on Comparison of Maternal and Perinatal Outcome of Twin Pregnancies Conceived Spontaneously and by Assisted Conception Methods
Authors: Aishvarya Gupta, Keerthana Anand, Sasirekha Rengaraj, Latha Chathurvedula
Abstract:
Introduction: Advances in assisted reproductive technology and the increase in the proportion of infertile couples have both contributed to the steep increase in the incidence of twin pregnancies in past decades. Maternal and perinatal complications are higher in twins than in singleton pregnancies. Studies comparing the maternal and perinatal outcomes of ART twin pregnancies versus spontaneously conceived twin pregnancies report heterogeneous results, making it unclear whether the complications are due to twin gestation per se or to assisted reproductive techniques. The present study aims to compare both maternal and perinatal outcomes in twin pregnancies conceived spontaneously and after assisted conception, so that targeted steps can be undertaken to improve the maternal and perinatal outcomes of twins. Objectives: To study perinatal and maternal outcomes in twin pregnancies conceived spontaneously as well as with assisted methods, and to compare the outcomes between the two groups. Setting: Women delivering at JIPMER (a tertiary care institute), Pondicherry. Population: 380 women with twin pregnancies who delivered in JIPMER between June 2015 and March 2017 were included in the study. Methods: The study population was divided into two cohorts: one conceived spontaneously and the other by assisted reproductive methods. The association of various maternal and perinatal outcomes with the method of conception was assessed using the chi-square test or Student's t test as appropriate. Multiple logistic regression analysis was done to assess the independent association of assisted conception with maternal outcomes after adjusting for age, parity and BMI, and with perinatal outcomes after adjusting for age, parity, BMI, chorionicity, gestational age at delivery and the presence of hypertension or gestational diabetes in the mother.
A p value of < 0.05 was considered significant. Result: There was a higher proportion of women with GDM (21% vs. 4.29%) and premature rupture of membranes (35% vs. 22.85%) in the assisted conception group, and more anemic women in the spontaneous group (71.27% vs. 55.1%). Assisted conception per se increased the incidence of GDM among twin gestations (OR 3.39, 95% CI 1.34-8.61) but did not influence any of the other maternal outcomes. Among the perinatal outcomes, assisted conception per se increased the risk of having very preterm (<32 weeks) neonates (OR 3.013, 95% CI 1.432-6.337). The mean birth weight did not significantly differ between the two groups (p = 0.429). Though a higher proportion of babies were admitted to the NICU in the assisted conception group (48.48% vs. 36.43%), assisted conception per se did not increase the risk of admission to the NICU (OR 1.23, 95% CI 0.76-1.98). There was no significant difference in perinatal mortality rates between the two groups (p = 0.829). Conclusion: Assisted conception per se increases the risk of developing GDM in women with twin gestation and the risk of delivering very preterm babies. Hence, measures should be taken to ensure appropriate GDM screening and suitable neonatal care in such pregnancies.
Keywords: assisted conception, maternal outcomes, perinatal outcomes, twin gestation
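Before the adjusted logistic regression this abstract reports, the unadjusted association can be read off a 2x2 table as an odds ratio with a Wald 95% confidence interval. A sketch with invented counts (not the study data):

```python
# Hedged sketch: odds ratio and 95% CI from a 2x2 table, the kind of crude
# estimate that precedes a multivariable logistic regression. The counts
# below are invented, not the JIPMER cohort data.
import math

def odds_ratio(a: int, b: int, c: int, d: int) -> tuple[float, float, float]:
    """a=exposed cases, b=exposed non-cases, c=unexposed cases, d=unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical: GDM among assisted-conception vs. spontaneous twin gestations.
or_, lo, hi = odds_ratio(21, 79, 12, 268)
print(f"OR={or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

The adjusted OR from a regression would generally differ from this crude value once age, parity and BMI are taken into account.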
Procedia PDF Downloads 210
14307 A New Approach to Image Stitching of Radiographic Images
Authors: Somaya Adwan, Rasha Majed, Lamya'a Majed, Hamzah Arof
Abstract:
In order to produce images of whole body parts, X-rays of different portions of the body are assembled using image stitching methods. This paper presents a new stitching method that jointly exploits a feature-based method and a direct-based method to identify and merge pairs of X-ray medical images. The performance of the proposed hybrid approach is investigated, and its ability to stitch and merge overlapping pairs of images is demonstrated. The proposed method displays comparable, if not superior, performance to other feature-based methods reported in the literature on the standard databases. These results are promising and demonstrate the potential of the proposed method for further development to tackle more advanced stitching problems.
Keywords: image stitching, direct based method, panoramic image, X-ray
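A one-dimensional toy of the "direct" half of such a hybrid approach: slide one intensity strip over another and keep the overlap offset with the lowest mean squared difference. Real radiograph stitching is two-dimensional and adds feature matching; the strips and the minimum-overlap parameter here are invented.

```python
# Hedged sketch: direct (intensity-based) alignment of two 1D strips by
# minimizing the per-pixel squared difference over candidate overlaps.

def best_offset(left: list[int], right: list[int], min_overlap: int = 3) -> int:
    """Return the offset of `right` relative to `left` with the lowest cost."""
    best, best_cost = 0, float("inf")
    for off in range(len(left) - min_overlap + 1):
        overlap = min(len(left) - off, len(right))
        cost = sum((left[off + i] - right[i]) ** 2 for i in range(overlap)) / overlap
        if cost < best_cost:
            best, best_cost = off, cost
    return best

left = [10, 20, 30, 40, 50, 60]
right = [40, 50, 60, 70, 80]        # overlaps the last three samples of `left`
off = best_offset(left, right)
print(off, left[:off] + right)      # the stitched strip
```

Normalizing the cost by overlap length, as above, avoids trivially favoring the shortest overlaps; 2D implementations face the same choice.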
Procedia PDF Downloads 541
14306 Behaviour of Non-local Correlations and Quantum Information Theoretic Measures in Frustrated Molecular Wheels
Authors: Amit Tribedi
Abstract:
Genuine quantumness present in quantum systems is the resource for implementing quantum information and computation protocols that can outperform their classical counterparts. Quantumness measures encompass non-local ones, known as quantum entanglement (QE), and quantum information theoretic (QIT) ones, e.g. quantum discord (QD). In this paper, some well-known measures of QE and QD are studied in wheel-like frustrated molecular magnetic systems. One of the systems has already been synthesized using coordination chemistry, and the other is hypothetical; in both, the dominant interaction is the spin-spin exchange interaction. Exact analytical methods and exact numerical diagonalization have been used. The behaviour of the correlations and the QIT measures reveals some counter-intuitive, non-trivial features, such as the non-monotonicity of quantum correlations with temperature and the persistence of multipartite entanglement over bipartite entanglement. The measures, being operational ones, can be used to realize the resource of quantumness in experiments.
Keywords: 0D magnets, discord, entanglement, frustration
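The simplest worked instance of the entanglement measures this abstract discusses is the bipartite entanglement entropy of a two-qubit Bell state, sketched below in pure Python. The molecular wheels themselves would instead require diagonalizing the full spin-exchange Hamiltonian.

```python
# Hedged sketch: entanglement entropy of the Bell state |phi+>, a minimal
# example of the non-local measures above; not the molecular-wheel computation.
import math

# |phi+> = (|00> + |11>)/sqrt(2), amplitudes over the basis 00, 01, 10, 11.
amp = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]

# Reduced density matrix of qubit A: rho_A[i][j] = sum_k amp[ik] * amp[jk].
rho_a = [[sum(amp[2 * i + k] * amp[2 * j + k] for k in range(2)) for j in range(2)]
         for i in range(2)]

# rho_A is diagonal here, so its eigenvalues are the diagonal entries.
eigs = [rho_a[0][0], rho_a[1][1]]
entropy = -sum(p * math.log2(p) for p in eigs if p > 1e-12)
print(entropy)  # one ebit, the maximum for two qubits
```

That the reduced state is maximally mixed (entropy of 1 bit) is exactly what "maximally entangled" means for a pure two-qubit state.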
Procedia PDF Downloads 228
14305 Slope Stability and Landslides Hazard Analysis, Limitations of Existing Approaches, and a New Direction
Authors: Alisawi Alaa T., Collins P. E. F.
Abstract:
The analysis and evaluation of slope stability and landslide hazards are critically important in civil engineering projects and broader considerations of safety. The level of slope stability risk should be identified due to its significant and direct financial and safety effects. Slope stability hazard analysis is performed considering static and/or dynamic loading circumstances. To reduce and/or prevent the failure hazard caused by landslides, a sophisticated and practical hazard analysis method using advanced constitutive modeling should be developed and linked to an effective solution that corresponds to the specific type of slope stability and landslide failure risk. Previous studies on slope stability analysis methods identify the failure mechanism and its corresponding solution. The commonly used approaches include limit equilibrium methods, empirical approaches for rock slopes (e.g., slope mass rating and Q-slope), finite element or finite difference methods, and distinct element codes. This study presents an overview and evaluation of these analysis techniques. Contemporary source materials are used to examine these various methods on the basis of hypotheses, the factor of safety estimation, soil types, load conditions, and analysis conditions and limitations. Limit equilibrium methods play a key role in assessing the level of slope stability hazard. The slope stability safety level can be defined by identifying the equilibrium of the shear stress and shear strength. The slope is considered stable when the movement-resisting forces are greater than those that drive the movement, with a factor of safety (the ratio of the resisting to the driving forces) greater than 1.00.
However, popular and practical methods, including limit equilibrium approaches, are not effective when the slope experiences complex failure mechanisms, such as progressive failure, liquefaction, internal deformation, or creep. The present study represents the first episode of an ongoing project that involves the identification of the types of landslide hazards; assessment of the level of slope stability hazard; development of a sophisticated and practical hazard analysis method; linkage of the failure type of specific landslide conditions to the appropriate solution; and application of an advanced computational method for mapping slope stability properties in the United Kingdom and elsewhere through a geographical information system (GIS) and inverse distance weighted (IDW) spatial interpolation. This study investigates and assesses the different analysis and solution techniques to enhance knowledge of the mechanism of slope stability and landslide hazard analysis and to determine the available solutions for each potential landslide failure risk.
Keywords: slope stability, finite element analysis, hazard analysis, landslides hazard
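The "resisting over driving forces" ratio described in this abstract has a classical closed form for the simplest limit equilibrium case, a dry infinite slope, sketched below with illustrative (not site) parameters.

```python
# Hedged sketch: the classical dry infinite-slope limit equilibrium check,
# the simplest instance of the factor-of-safety ratio discussed above.
import math

def factor_of_safety(c: float, phi_deg: float, gamma: float,
                     h: float, beta_deg: float) -> float:
    """FS = (c + gamma*h*cos^2(beta)*tan(phi)) / (gamma*h*sin(beta)*cos(beta)).

    c: cohesion (kPa), phi_deg: friction angle, gamma: unit weight (kN/m^3),
    h: depth of the slip plane (m), beta_deg: slope angle. Dry slope, no pore
    pressure; real assessments must include groundwater.
    """
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    resisting = c + gamma * h * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * h * math.sin(beta) * math.cos(beta)
    return resisting / driving

fs = factor_of_safety(c=10.0, phi_deg=30.0, gamma=18.0, h=2.0, beta_deg=25.0)
print(f"FS = {fs:.2f}  ({'stable' if fs > 1.0 else 'unstable'})")
```

A useful sanity check built into the formula: for a cohesionless slope with the friction angle equal to the slope angle, FS is exactly 1, the limiting equilibrium the abstract describes.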
Procedia PDF Downloads 100
14304 Mathematical Modeling for Diabetes Prediction: A Neuro-Fuzzy Approach
Authors: Vijay Kr. Yadav, Nilam Rathi
Abstract:
Accurate prediction of glucose levels in diabetes mellitus is required to avoid affecting the functioning of major organs of the human body. This study describes the fundamental assumptions and two different methodologies for blood glucose prediction. The first is based on the back-propagation algorithm of the Artificial Neural Network (ANN), and the second is based on a neuro-fuzzy technique, the Fuzzy Inference System (FIS). Errors of the proposed methods are further discussed through statistical measures such as the mean square error (MSE) and normalised mean absolute error (NMAE). The main objective of the present study is to develop a mathematical model for predicting blood glucose 12 hours in advance, using a data set of three patients over 60 days. Comparative studies of accuracy against other existing models are also made with the same data set.
Keywords: back-propagation, diabetes mellitus, fuzzy inference system, neuro-fuzzy
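The two error metrics named in this abstract can be sketched directly. NMAE is normalized here by the observed range, which is one of several conventions, and the glucose values are invented rather than patient data.

```python
# Hedged sketch of MSE and NMAE on toy glucose readings; the normalization
# choice for NMAE (observed range) is an assumption, not the authors'.

def mse(actual, predicted):
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

def nmae(actual, predicted):
    mae = sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)
    return mae / (max(actual) - min(actual))  # one common normalization choice

actual = [90.0, 110.0, 150.0, 130.0]      # mg/dL, invented
predicted = [95.0, 105.0, 140.0, 135.0]
print(mse(actual, predicted), nmae(actual, predicted))
```

MSE penalizes large excursions quadratically, which matters clinically since large glucose prediction errors are the dangerous ones; NMAE gives a scale-free summary for comparing patients.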
Procedia PDF Downloads 257
14303 Parameters Tuning of a PID Controller on a DC Motor Using Honey Bee and Genetic Algorithms
Authors: Saeid Jalilzadeh
Abstract:
PID controllers are widely used to control industrial plants because of their robustness and simple structure. Tuning the controller's parameters to obtain a desired response is difficult and time consuming. With the development of computer technology and artificial intelligence in the automatic control field, many parameter tuning methods for PID controllers have emerged, bringing much energy to the study of PID control; however, many advanced tuning methods do not perform as well as expected. The Honey Bee algorithm (HBA) and genetic algorithm (GA) are extensively used for real-parameter optimization in diverse fields of study. This paper describes an application of HBA and GA to the problem of designing a PID controller whose parameters comprise the proportional, integral and derivative constants. The presence of three parameters to optimize makes designing a PID controller more challenging than designing conventional P, PI, and PD controllers. The suitability of the proposed approach has been demonstrated through computer simulation using MATLAB/SIMULINK.
Keywords: controller, GA, optimization, PID, PSO
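A miniature genetic algorithm in the spirit of this abstract, tuning (Kp, Ki, Kd) for a toy first-order plant rather than the paper's DC motor model. The population size, gain ranges, plant, and cost function are all assumptions, and the real study used MATLAB/SIMULINK rather than anything like this sketch.

```python
# Hedged sketch: a tiny GA minimizing the integral of squared error (ISE)
# of a PID loop on an assumed first-order plant dx/dt = (-x + u)/tau.
import random

BOUNDS = [(0.0, 5.0), (0.0, 2.0), (0.0, 0.5)]    # assumed ranges for Kp, Ki, Kd

def ise(gains, tau=1.0, dt=0.01, steps=500):
    """ISE for a unit step reference, simulated by forward Euler."""
    kp, ki, kd = gains
    x, integ, prev_e, cost = 0.0, 0.0, 1.0, 0.0
    for _ in range(steps):
        e = 1.0 - x
        if abs(e) > 1e6:                         # the loop diverged
            return 1e9                           # heavy penalty for such gains
        integ += e * dt
        u = kp * e + ki * integ + kd * (e - prev_e) / dt
        prev_e = e
        x += (-x + u) / tau * dt
        cost += e * e * dt
    return cost

random.seed(0)
pop = [[random.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(20)]
for _ in range(30):                              # generations
    pop.sort(key=ise)                            # fitness = lower ISE
    parents = pop[:10]                           # elitism: keep the best half
    children = []
    for _ in range(10):
        a, b = random.sample(parents, 2)         # crossover: average two parents
        child = [(ga + gb) / 2 + random.gauss(0.0, 0.2) for ga, gb in zip(a, b)]
        children.append([min(hi, max(lo, g))     # mutate, then clamp to bounds
                         for g, (lo, hi) in zip(child, BOUNDS)])
    pop = parents + children
best = min(pop, key=ise)
print("best (Kp, Ki, Kd):", [round(g, 2) for g in best], "ISE:", round(ise(best), 4))
```

HBA would differ mainly in how new candidates are generated around good ones; the evaluate-select-vary loop is the shared skeleton.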
Procedia PDF Downloads 544
14302 Optimized Deep Learning-Based Facial Emotion Recognition System
Authors: Erick C. Valverde, Wansu Lim
Abstract:
Facial emotion recognition (FER) systems have recently been developed for more advanced computer vision applications. The ability to identify human emotions would enable smart healthcare facilities to diagnose mental health illnesses (e.g., depression and stress) and support better human social interaction with smart technologies. An FER system involves two steps: 1) a face detection task and 2) a facial emotion recognition task. It classifies human expressions into categories such as angry, disgust, fear, happy, sad, surprise, and neutral. Such a system requires intensive research to address issues with human diversity, unique human expressions, and the variety of human facial features due to age differences. These issues generally limit the ability of FER systems to detect human emotions with high accuracy. Early FER systems used simple supervised classification algorithms like K-nearest neighbors (KNN) and artificial neural networks (ANN). These conventional FER systems suffer from low accuracy because they cannot efficiently extract the significant features of several human emotions. To increase the accuracy of FER systems, deep learning (DL)-based methods, like convolutional neural networks (CNN), have been proposed. These methods can find more complex features in the human face by means of the deeper connections within their architectures. However, the inference speed and computational cost of a DL-based FER system are often disregarded in exchange for higher accuracy. To cope with this drawback, an optimized DL-based FER system is proposed in this study. An extreme version of Inception V3, known as the Xception model, is leveraged by applying different network optimization methods. Specifically, network pruning and quantization are used to lower computational costs and reduce memory usage, respectively.
To support low resource requirements, a 68-landmark face detector from Dlib is used in the early step of the FER system. Furthermore, a DL compiler is utilized to incorporate advanced optimization techniques into the Xception model to improve the inference speed of the FER system. In comparison to VGG-Net and ResNet50, the proposed optimized DL-based FER system experimentally demonstrates the objectives of the network optimization methods used. As a result, the proposed approach can be used to create an efficient, real-time FER system.
Keywords: deep learning, face detection, facial emotion recognition, network optimization methods
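The two network optimization steps this abstract names, magnitude pruning and int8 quantization, can be sketched on a toy weight vector. This stands in for, and is far simpler than, what a DL compiler does to the actual Xception layers.

```python
# Hedged sketch: magnitude pruning zeroes the smallest weights; symmetric
# int8 quantization maps the survivors to 8-bit integers. Toy weights only.

def prune(weights: list[float], sparsity: float) -> list[float]:
    """Zero out the fraction `sparsity` of weights with smallest magnitude."""
    k = int(len(weights) * sparsity)
    threshold = sorted(abs(w) for w in weights)[k - 1] if k else -1.0
    return [0.0 if abs(w) <= threshold else w for w in weights]

def quantize_int8(weights: list[float]) -> tuple[list[int], float]:
    """Symmetric quantization: w ~ q * scale, with q an integer in [-127, 127]."""
    scale = max(abs(w) for w in weights) / 127 or 1.0
    return [round(w / scale) for w in weights], scale

w = [0.02, -0.5, 0.31, -0.07, 0.9, 0.001, -0.24, 0.12]
pruned = prune(w, 0.5)           # half the weights become zero
q, scale = quantize_int8(pruned) # the rest become int8 codes plus one scale
print(pruned)
print(q, scale)
```

Pruning buys skippable multiplications (lower compute) while quantization shrinks each stored weight from 32 bits to 8 (lower memory), matching the division of labor described in the abstract.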
Procedia PDF Downloads 118
14301 Sensory Gap Analysis on Port Wine Promotion and Perceptions
Authors: José Manue Carvalho Vieira, Mariana Magalhães, Elizabeth Serra
Abstract:
The Port Wine industry is essential to Portugal for its tangible cultural heritage as well as for social and economic reasons. Positioned as a luxury product, brands need to pay more attention to the new generation's habits, preferences, languages, and sensory perceptions. Healthy lifestyles, anti-alcohol campaigns, and the digitalisation of the buying decision process need to be better understood in order to understand the wine market of the future. The purpose of this study is to clarify the sensory perception gap between Port Wine promotion descriptors and the new generation's perceptions, to help wineries align their strategies. Based on an interpretivist approach, with multiple methods and techniques (mixed methods), different world views and assumptions, and different data collection methods and analyses, this research integrated qualitative semi-structured interviews, Port Wine promotion contents, and social media perceptions mined by the Sentiment Analysis Enginius algorithm. Findings confirm that Port Wine CEOs' strategies, brands' promotional content, and social perceptions are not sufficiently aligned. The central insight for Port Wine brand managers is that long, continuous work is needed to understand and associate their descriptors with the most relevant perceptual values and criteria of their targets in order to reposition (when necessary) and sustainably revitalise their brands. Finally, this study hypothesised a sensory gap that leads to a decrease in consumption, and sought recommendations on how to transform it into an advantage for better attracting the young age group (18-25).
Keywords: port wine, consumer habits, sensory gap analysis, wine marketing
Procedia PDF Downloads 246
14300 A Conceptual E-Business Model and the Effect of Strategic Planning Parameters on E-Business Strategy Management and Performance
Authors: Alexandra Lipitakis, Evangelia A. E. C. Lipitakis
Abstract:
In this article, a class of e-business strategy planning parameters is introduced, and their effect on the financial and non-financial performance of e-businesses and organizations is investigated. The relationships between these strategic planning parameters, i.e. Formality, Participation, Sophistication, Thoroughness, Synergy and Cooperation, Entropic Factor, Adaptivity, and Uncertainty, and financial and non-financial performance are examined, and the directions of these relationships are given. A conceptual model has been constructed, and quantitative research methods can be used to test the eight hypotheses considered. In the framework of e-business strategy planning, this research study clearly demonstrates how strategic planning components have positive relationships with e-business strategy management and performance.
Keywords: e-business management, e-business model, e-business performance assessments, strategy management methodologies, strategy planning, quantitative methods
Procedia PDF Downloads 390
14299 Industrial Wastewater Sludge Treatment in Chongqing, China
Authors: Victor Emery David Jr., Jiang Wenchao, Yasinta John, Md. Sahadat Hossain
Abstract:
Sludge originates from the process of treating wastewater. It is the byproduct of wastewater treatment, containing concentrated heavy metals, poorly biodegradable trace organic compounds, and potentially pathogenic organisms (viruses, bacteria, etc.), which are usually difficult to treat or dispose of. China, like other countries, is no stranger to the challenges posed by increasing volumes of wastewater. Treatment and disposal of sludge have been a problem for most cities in China, exacerbated by issues such as a lack of technology and funding. Suitable methods for the local climatic conditions are still unavailable for modern cities in China. Against this background, this paper describes the methods used for the treatment and disposal of sludge from industries and suggests a suitable method for treatment and disposal in Chongqing, China. The research found that the highest treatment rate of sludge in Chongqing was 10.08%. The industrial waste piping system is not separated from the domestic system. Considering the proliferation of industry and urbanization, the production of sludge in Chongqing is likely to increase. If the sludge produced is not properly managed, this may lead to adverse health and environmental effects. Disposal costs and methods for Chongqing were also analysed. Research showed that incineration is the most expensive method of sludge disposal in China and Chongqing. Subsequent research therefore considered alternatives such as composting, a relatively cheap disposal method given the vast population and the current technological and economic conditions of Chongqing, and of China at large.
Keywords: Chongqing/China, disposal, industrial, sludge, treatment
Procedia PDF Downloads 321
14298 Long Memory and ARFIMA Modelling: The Case of CPI Inflation for Ghana and South Africa
Authors: A. Boateng, La Gil-Alana, M. Lesaoana, Hj. Siweya, A. Belete
Abstract:
This study examines long memory, or long-range dependence, in the CPI inflation rates of Ghana and South Africa using Whittle methods and autoregressive fractionally integrated moving average (ARFIMA) models. Standard I(0)/I(1) methods such as the Augmented Dickey-Fuller (ADF), Phillips-Perron (PP) and Kwiatkowski-Phillips-Schmidt-Shin (KPSS) tests were also employed. Our findings indicate that long memory exists in the CPI inflation rates of both countries. After applying fractional differencing and determining the short memory components, the models were specified as ARFIMA (4,0.35,2) and ARFIMA (3,0.49,3) for Ghana and South Africa, respectively. Consequently, the CPI inflation rates of both countries are fractionally integrated and mean reverting. The implication of this result is to assist in policy formulation and the identification of inflationary pressures in an economy.
Keywords: Consumer Price Index (CPI) inflation rates, Whittle method, long memory, ARFIMA model
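The fractional differencing that ARFIMA applies can be illustrated via the binomial expansion of the operator (1 - B)^d. The recursion below is the standard one for those weights, evaluated at d = 0.35, the Ghana estimate quoted in the abstract.

```python
# Hedged sketch: weights of the fractional difference operator (1 - B)^d,
# via the standard recursion pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k.

def frac_diff_weights(d: float, n: int) -> list[float]:
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (k - 1 - d) / k)
    return w

# For 0 < d < 0.5 the weights decay hyperbolically rather than geometrically:
# the signature of long memory that separates ARFIMA from plain ARMA.
w = frac_diff_weights(0.35, 6)
print([round(x, 4) for x in w])
```

Applying these weights to a series (a truncated convolution) produces the fractionally differenced series on which the short-memory ARMA components are then fitted.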
Procedia PDF Downloads 369
14297 Analysis of Formation Methods of Range Profiles for an X-Band Coastal Surveillance Radar
Authors: Nguyen Van Loi, Le Thanh Son, Tran Trung Kien
Abstract:
The paper deals with the problem of the formation of range profiles (RPs) for an X-band coastal surveillance radar. Two popular methods, the difference operator method and the window-based method, are reviewed and analyzed via two tests with different datasets. The test results show that although the original window-based method achieves better performance than the difference operator method, it has three main drawbacks: the use of 3 or 4 peaks of an RP for creating the windows; the extension of the window size using the power sum of three adjacent cells on the left and right sides of the windows; and the same threshold applied to all types of vessels to finish the RP formation process. These drawbacks lead to inaccurate RPs at low signal-to-clutter ratios. Therefore, some suggestions are proposed to improve the original window-based method.
Keywords: range profile, difference operator method, window-based method, automatic target recognition
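A toy illustration of why a single fixed threshold (the third drawback above) is fragile: the target extent recovered from an invented range profile shrinks as soon as the threshold exceeds part of the echo, which is exactly what happens for weak returns at low signal-to-clutter ratio. This is a simplification, not the paper's window-based algorithm.

```python
# Hedged sketch: recovering a target's extent from a range profile by simple
# thresholding. The profile values (clutter floor ~1, echo in cells 4-7) are
# invented for illustration.

def target_extent(profile: list[float], threshold: float) -> tuple[int, int]:
    """Return (first, last) range-cell indices whose amplitude exceeds threshold."""
    cells = [i for i, v in enumerate(profile) if v > threshold]
    if not cells:
        return (-1, -1)
    return (cells[0], cells[-1])

rp = [1.1, 0.9, 1.0, 1.2, 6.0, 8.5, 7.2, 5.1, 1.3, 0.8]
print(target_extent(rp, threshold=3.0))  # captures the full echo, cells 4-7
print(target_extent(rp, threshold=7.0))  # too high: the profile is truncated
```

A per-vessel or clutter-adaptive threshold, in the direction of the improvements the abstract proposes, would avoid the truncation shown by the second call.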
Procedia PDF Downloads 127
14296 Evaluating Urban City Indices: A Study for Investigating Functional Domains, Indicators and Integration Methods
Authors: Fatih Gundogan, Fatih Kafali, Abdullah Karadag, Alper Baloglu, Ersoy Pehlivan, Mustafa Eruyar, Osman Bayram, Orhan Karademiroglu, Wasim Shoman
Abstract:
Nowadays many cities around the world are investing effort and resources to facilitate their citizens' lives and make cities more livable and sustainable by implementing the newly emerged phenomenon of the smart city. For this purpose, research institutions prepare and publish smart city indices or benchmarking reports that aim to measure a city's current 'smartness' status. Several functional domains and various indicators, along with different selection and calculation methods, are found within such indices and reports. The selection criteria vary for each institution, resulting in inconsistency in ranking and evaluation. This research aims to evaluate the impact of selecting such functional domains, indicators and calculation methods, which may change the rank. For that, six functional domains, i.e. Environment, Mobility, Economy, People, Living and Governance, were selected, covering 19 focus areas and 41 sub-focus (variable) areas. 60 out of 191 indicators were also selected according to several criteria, identified as a result of an extensive literature review of 13 well-known global indices and of the ISO 37120 standard for the sustainable development of communities. The values of the identified indicators were obtained from reliable sources for ten cities and were normalized and standardized to objectively investigate the impact of the chosen indicators. Moreover, the effect of choosing an integration method to represent the indicator values for each city is investigated by comparing the results of two of the most used methods, i.e. geometric aggregation and fuzzy logic. The essence of these methods is assigning each indicator a weight reflecting its relative significance. However, the two methods resulted in different weights for the same indicator.
As a result of this study, the change in city ranking produced by each method was investigated and discussed separately. Generally, each method produced a different ranking for the selected cities. However, it was observed that within certain functional areas the rank remained unchanged under both integration methods. Based on the results of the study, it is recommended to adopt a common platform and method to objectively evaluate cities around the world. The common method should give policymakers proper tools to evaluate their decisions and investments relative to other cities. Moreover, at least 481 different indicators were found for smart city indices, which is an immense number to consider, especially for a single index. Further work should be devoted to finding mutual indicators that represent the index purpose globally and objectively.
Keywords: functional domain, urban city index, indicator, smart city
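As a sketch of the kind of weighted aggregation the study compares (the values and helper names below are illustrative, not taken from the study's dataset), min-max normalization of an indicator followed by a weighted geometric mean can be written as:

```python
def min_max_normalize(values):
    """Rescale a list of indicator values to [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def geometric_aggregate(indicators, weights):
    """Weighted geometric mean of normalized indicators; weights sum to 1.
    A near-zero indicator drags the whole index down, which is why
    geometric aggregation penalizes unbalanced city profiles."""
    assert abs(sum(weights) - 1.0) < 1e-9
    out = 1.0
    for v, w in zip(indicators, weights):
        out *= v ** w
    return out

# Toy example: one city scored on two equally weighted indicators.
city_score = geometric_aggregate([0.5, 0.5], [0.5, 0.5])
```

This contrasts with a weighted arithmetic mean, under which a city can fully compensate a weak indicator with a strong one; the choice between the two aggregation rules is exactly the sort of methodological decision the study shows can reshuffle rankings.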
Procedia PDF Downloads 147
14295 Patient-Specific Modeling Algorithm for Medical Data Based on AUC
Authors: Guilherme Ribeiro, Alexandre Oliveira, Antonio Ferreira, Shyam Visweswaran, Gregory Cooper
Abstract:
Patient-specific models are instance-based learning algorithms that take advantage of the particular features of the patient case at hand to predict an outcome. We introduce two patient-specific algorithms, based on the decision tree paradigm, that use the AUC as the metric for attribute selection. We apply the patient-specific algorithms to predict outcomes in several datasets, including medical datasets. Compared to the entropy-based patient-specific decision path (PSDP) and CART methods, the AUC-based patient-specific decision path models performed equivalently on area under the ROC curve (AUC). Our results support patient-specific methods as a promising approach for making clinical predictions.
Keywords: instance-based approach, area under the ROC curve, patient-specific decision path, clinical predictions
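A minimal sketch of AUC-driven attribute selection of the sort the abstract describes (a pure-Python Mann-Whitney AUC; the helper names are ours, and the actual PSDP algorithms are considerably more involved):

```python
def auc_score(scores, labels):
    """AUC of a single attribute as a ranker of the binary outcome,
    via the Mann-Whitney statistic; ties count half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def best_attribute(columns, labels):
    """Pick the attribute with the highest direction-invariant AUC
    (an attribute that ranks cases perfectly backwards is just as useful)."""
    aucs = [max(a, 1.0 - a) for a in (auc_score(c, labels) for c in columns)]
    return max(range(len(aucs)), key=aucs.__getitem__), aucs
```

A decision-path learner would call `best_attribute` at each node in place of an entropy-based gain criterion, which is the substitution the abstract evaluates.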
Procedia PDF Downloads 479
14294 General Mathematical Framework for Analysis of Cattle Farm System
Authors: Krzysztof Pomorski
Abstract:
In this work we present a universal mathematical framework for modeling a cattle farm system that can formulate and validate various hypotheses to be tested against experimental data. The presented work is preliminary, but it is expected to be a valid tool for future, deeper analysis that can result in a new class of prediction methods allowing early detection of cow diseases as well as assessment of cow performance. The presented work is therefore relevant to agricultural modeling and to machine learning. It also opens the possibility of incorporating a certain class of biological models necessary for modeling cow behavior and farm performance, which might include the impact of the environment on the farm system. Particular attention is paid to the model of coupled oscillators, the basic building block for constructing models that show certain periodic or quasiperiodic behavior.
Keywords: coupled ordinary differential equations, cattle farm system, numerical methods, stochastic differential equations
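The coupled-oscillator building block can be illustrated with a standard Kuramoto-type phase model; this is a generic sketch under our own assumptions (Euler integration, illustrative parameters), not the author's model:

```python
import math

def simulate_coupled_phases(omega, coupling, steps=2000, dt=0.01):
    """Euler integration of Kuramoto-type phase oscillators:
    dtheta_i/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)."""
    n = len(omega)
    theta = [0.5 * i for i in range(n)]  # arbitrary initial phases
    for _ in range(steps):
        dtheta = [omega[i] + (coupling / n) *
                  sum(math.sin(theta[j] - theta[i]) for j in range(n))
                  for i in range(n)]
        theta = [t + dt * d for t, d in zip(theta, dtheta)]
    return theta

def order_parameter(theta):
    """|mean of unit phasors|: near 1 means synchrony, near 0 incoherence."""
    n = len(theta)
    c = sum(math.cos(t) for t in theta) / n
    s = sum(math.sin(t) for t in theta) / n
    return math.hypot(c, s)
```

With coupling strong relative to the spread of natural frequencies the phases lock (order parameter near 1); with zero coupling they drift apart, which is the qualitative distinction such a farm-system model would exploit to represent synchronized versus disrupted periodic behavior.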
Procedia PDF Downloads 145
14293 The Network Relative Model Accuracy (NeRMA) Score: A Method to Quantify the Accuracy of Prediction Models in a Concurrent External Validation
Authors: Carl van Walraven, Meltem Tuna
Abstract:
Background: Network meta-analysis (NMA) quantifies the relative efficacy of 3 or more interventions from studies containing a subgroup of interventions. This study applied the analytical approach of NMA to quantify the relative accuracy of prediction models with distinct inclusion criteria that are evaluated on a common population ('concurrent external validation'). Methods: We simulated binary events in 5000 patients using a known risk function. We biased the risk function and modified its precision by pre-specified amounts to create 15 prediction models with varying accuracy and distinct patient applicability. Prediction model accuracy was measured using the Scaled Brier Score (SBS). Overall prediction model accuracy was measured using fixed-effects methods that accounted for model applicability patterns. Prediction model accuracy was summarized as the Network Relative Model Accuracy (NeRMA) Score, which ranges from -∞ through 0 (accuracy of random guessing) to 1 (accuracy of the most accurate model in the concurrent external validation). Results: The unbiased prediction model had the highest SBS. The NeRMA score correctly ranked all simulated prediction models by the extent of bias from the known risk function. A SAS macro and an R function were created to implement the NeRMA Score. Conclusions: The NeRMA Score makes it possible to quantify the accuracy of binomial prediction models having distinct inclusion criteria in a concurrent external validation.
Keywords: prediction model accuracy, scaled Brier score, fixed effects methods, concurrent external validation
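The Scaled Brier Score underpinning the method has a standard definition: 1 - BS/BS_ref, where the reference model always predicts the overall event rate. A minimal sketch (the NeRMA aggregation itself is not reproduced here):

```python
def brier_score(probs, outcomes):
    """Mean squared difference between predicted probabilities and 0/1 outcomes."""
    n = len(outcomes)
    return sum((p - y) ** 2 for p, y in zip(probs, outcomes)) / n

def scaled_brier_score(probs, outcomes):
    """SBS = 1 - BS / BS_ref, with the reference model predicting the event rate.
    1 = perfect, 0 = no better than the base rate, negative = worse."""
    rate = sum(outcomes) / len(outcomes)
    ref = brier_score([rate] * len(outcomes), outcomes)
    return 1.0 - brier_score(probs, outcomes) / ref
```

Because the SBS is already scaled against a base-rate reference, it is comparable across models whose applicable subpopulations have different event rates, which is what makes it a natural input to a network-style comparison.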
Procedia PDF Downloads 236
14292 Designing of Food Products Enriched With Phytonutrients Assigned for Hypertension Suffering Consumers
Authors: Anna Gramza-Michałowska, Dominik Kmiecik, Justyna Bilon, Joanna Skręty, Joanna Kobus-Cisowska, Józef Korczak, Andrzej Sidor
Abstract:
Background: Hypertension is one of the civilization diseases with a global scope. Much research has shown that the everyday diet significantly influences our health, helping with prophylaxis and disease treatment; the key factor here is the presence of natural bioactive components of plant origin. Aim: The following research describes health-oriented snack products for hypertension sufferers, enriched with selected plant ingredients. Various analytical methods have been applied to determine the products' basic composition and their antioxidant activity. Methods: The snack products were formulated from a composition of different flours, oil, yeast, plant particles and extracts. The basic composition of a product was evaluated as the content of protein, lipids, fiber and ash, and the caloricity. The antioxidant capacity of the snacks was evaluated with radical scavenging methods (DPPH, ABTS) and the ORAC value. As new products, the proposed snacks were also characterized by sensory analysis. Results and discussion: The results showed that the addition of phytonutrients improved the nutritional and antioxidative value of the examined products. The antiradical potential was also significantly increased, with no loss of sensory value. Conclusions: The designed snack is rich in polyphenolics that express high antioxidant activity, helpful in hypertension prophylaxis and, as a low-calorie product, in obesity prophylaxis.
Keywords: antioxidant, well-being, hypertension, bioactive compounds
Procedia PDF Downloads 496
14291 Assessment of Five Photoplethysmographic Methods for Estimating Heart Rate Variability
Authors: Akshay B. Pawar, Rohit Y. Parasnis
Abstract:
Heart Rate Variability (HRV) is a widely used indicator of the regulation between the autonomic nervous system (ANS) and the cardiovascular system. Besides being non-invasive, it also has the potential to predict mortality in cases involving critical injuries. The gold standard method for determining HRV is based on the analysis of RR interval time series extracted from ECG signals. However, because it is much more convenient to obtain photoplethysmographic (PPG) signals than ECG signals (which require the attachment of several electrodes to the body), many researchers have used pulse cycle intervals instead of RR intervals to estimate HRV, and have compared this method with the gold standard technique. Though most of their observations indicate a strong correlation between the two methods, recent studies show that in healthy subjects, except for a few parameters, the pulse-based method cannot be a surrogate for the standard RR interval-based method. Moreover, the former tends to overestimate short-term variability in heart rate. This calls for improvements in, or alternatives to, the pulse cycle interval method. In this study, besides the systolic peak-peak interval method (PP method) that has been studied several times, four recent PPG-based techniques, namely the first derivative peak-peak interval method (P1D method), the second derivative peak-peak interval method (P2D method), the valley-valley interval method (VV method) and the tangent-intersection interval method (TI method), were compared with the gold standard technique. ECG and PPG signals were obtained from 10 young and healthy adults (both males and females) seated in the armchair position. In order to de-noise these signals and eliminate baseline drift, they were passed through digital filters.
After filtering, the following HRV parameters were computed from PPG using each of the five methods, and also from ECG using the gold standard method: time domain parameters (SDNN, pNN50 and RMSSD) and frequency domain parameters (very low-frequency power (VLF), low-frequency power (LF), high-frequency power (HF) and total power (TP)). Besides, Poincaré plots were also constructed and their SD1/SD2 ratios determined. The resulting sets of parameters were compared with those yielded by the standard method using measures of statistical correlation (correlation coefficient) as well as statistical agreement (Bland-Altman plots). From the viewpoint of correlation, our results show that the best PPG-based methods for the determination of most parameters and Poincaré plots are the P2D method (more than 93% correlation with the standard method) and the PP method (mean correlation: 88%), whereas the TI, VV and P1D methods perform poorly (<70% correlation in most cases). However, our evaluation of statistical agreement using Bland-Altman plots shows that none of the five techniques agrees satisfactorily with the gold standard method as far as time-domain parameters are concerned. In conclusion, excellent statistical correlation implies that certain PPG-based methods provide a good amount of information on the pattern of heart rate variation, whereas poor statistical agreement implies that PPG cannot completely replace ECG in the determination of HRV.
Keywords: photoplethysmography, heart rate variability, correlation coefficient, Bland-Altman plot
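The time-domain parameters named above have standard definitions; a sketch of how they are computed from an interval series (our own helper, assuming RR intervals in milliseconds; the study's pipeline additionally covers the frequency-domain and Poincaré measures):

```python
import math

def time_domain_hrv(rr_ms):
    """SDNN, RMSSD and pNN50 from an RR-interval series in milliseconds."""
    n = len(rr_ms)
    mean_rr = sum(rr_ms) / n
    # SDNN: sample standard deviation of all intervals
    sdnn = math.sqrt(sum((r - mean_rr) ** 2 for r in rr_ms) / (n - 1))
    # Successive differences drive the short-term measures
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    pnn50 = 100.0 * sum(1 for d in diffs if abs(d) > 50) / len(diffs)
    return sdnn, rmssd, pnn50
```

RMSSD and pNN50 depend only on beat-to-beat differences, which is why the choice of PPG fiducial point (peak, valley, derivative, tangent intersection) affects them so strongly and why short-term variability is the hardest quantity for PPG to reproduce.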
Procedia PDF Downloads 324
14290 Lake of Neuchatel: Effect of Increasing Storm Events on Littoral Transport and Coastal Structures
Authors: Charlotte Dreger, Erik Bollaert
Abstract:
This paper presents two environmentally-friendly coastal structures realized on the Lake of Neuchâtel. Both structures reflect current environmental concerns on the lake and have been strongly affected by extreme meteorological conditions between their design period and their operational period. The Lake of Neuchâtel is one of the biggest Swiss lakes, measuring around 38 km in length and 8.2 km in width, with a maximum water depth of 152 m. Its particular topographical alignment, between the Swiss Plateau and the Jura mountains, combines strong winds and large fetch values, resulting in significant wave heights during storm events at both the north-east and south-west lake extremities. In addition, due to flooding concerns, lake levels were historically lowered by several meters during the Jura correction works in the 19th and 20th centuries. Hence, during storm events, continuous erosion of the vulnerable molasse shorelines and sand banks generates frequent and abundant littoral transport from the center of the lake to its extremities. This phenomenon not only disturbs the ecosystem but also generates numerous problems at natural or man-made infrastructure located along the shorelines, such as reed plants, harbor entrances, canals, etc. A first example is provided at the southwestern extremity, near the city of Yverdon, where an ensemble of 11 small islands, the Iles des Vernes, has been artificially created to enhance biological conditions and food availability for bird species during their migration, replacing two larger islands that were affected by a lack of morphodynamics and general vegetalization of their surfaces. The article will present the concept and dimensioning of these islands based on 2D numerical modelling, as well as their realization and follow-up campaigns.
In particular, the influence of several major storm events that occurred immediately after the works will be pointed out. Second, a sediment retention dike is discussed at the northeastern extremity, at the entrance of the Canal de la Broye into the lake. This canal is heavily used for navigation and suffers from frequent and significant sedimentation at its outlet. The new coastal structure has been designed to minimize sediment deposits around the outlet of the canal into the lake by retaining the littoral transport during storm events. The article will describe the basic assumptions used to design the dike, as well as the construction works and follow-up campaigns. It will especially point out the huge influence of changing meteorological conditions on the littoral transport of the Lake of Neuchâtel since the project was designed ten years ago. Not only are the intensity and frequency of storm events increasing, but the main wind directions are also shifting, affecting the efficiency of the coastal structure in retaining sediments.
Keywords: meteorological evolution, sediment transport, Lake of Neuchâtel, numerical modelling, environmental measures
Procedia PDF Downloads 85
14289 Identifying Promoters and Their Types Based on a Two-Layer Approach
Authors: Bin Liu
Abstract:
A prokaryotic promoter, consisting of two short DNA sequences located at the -35 and -10 positions, is responsible for controlling the initiation of gene expression. Different types of promoters have different functions, and their consensus sequences are similar. In addition, the consensus sequences may differ within the same type of promoter, which poses difficulties for promoter identification. Unfortunately, all existing computational methods treat promoter identification as a binary classification task and can only identify whether a query sequence belongs to a specific promoter type. It is therefore desirable to develop computational methods for effectively identifying both promoters and their types. Here, a two-layer predictor is proposed to deal with this problem. The first layer predicts whether a given sequence is a promoter, and the second layer predicts the type of any sequence judged to be a promoter. Meanwhile, we also analyze the importance of features and sequence conservation in two respects: promoter identification and promoter type identification. To the best of our knowledge, this is the first computational predictor that detects both promoters and their types.
Keywords: promoter, promoter type, random forest, sequence information
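The two-layer idea can be sketched as a simple composition of classifiers. The motif rules below are toy stand-ins of our own (the paper's layers are random forests over sequence features, not motif lookups); only the layering structure is the point:

```python
def make_motif_detector(motif):
    """Toy layer-1 rule: a sequence is called a promoter if it contains the motif."""
    return lambda seq: motif in seq

class TwoLayerPromoterPredictor:
    """Layer 1 decides promoter vs non-promoter; layer 2 assigns a type
    only to sequences that pass layer 1."""

    def __init__(self, is_promoter, promoter_type):
        self.is_promoter = is_promoter        # seq -> bool
        self.promoter_type = promoter_type    # seq -> type label

    def predict(self, seq):
        if not self.is_promoter(seq):
            return "non-promoter"
        return self.promoter_type(seq)

# Toy rules: the -10 consensus TATAAT flags a promoter; the -35 consensus
# TTGACA distinguishes a hypothetical "sigma70-like" type from the rest.
predictor = TwoLayerPromoterPredictor(
    make_motif_detector("TATAAT"),
    lambda seq: "sigma70-like" if "TTGACA" in seq else "other-type",
)
```

The advantage of the layered design is that the type classifier never sees non-promoter sequences, so each layer can be trained on the sub-problem it is actually responsible for.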
Procedia PDF Downloads 184
14288 Prevention of Road Accidents by Computerized Drowsiness Detection System
Authors: Ujjal Chattaraj, P. C. Dasbebartta, S. Bhuyan
Abstract:
This paper aims to propose a method to detect the action of the driver's eyes using the concept of face detection. There are three major contributing methods that can rapidly process frames of the facial image and hence produce results that can trigger pre-programmed vehicle reactions for traffic safety. This paper compares and analyses these methods on the basis of their reaction time and their ability to deal with fluctuating images of the driver. The program used in this study is simple and efficient, built using the AdaBoost learning algorithm. Through this program, the system is able to discard background regions and focus on face-like regions. The results are analyzed on a common computer, which makes the approach feasible for end users. The application domain of this experiment is quite wide, covering, for example, the detection of drowsiness or alcohol influence in drivers, or identification.
Keywords: AdaBoost learning algorithm, face detection, framework, traffic safety
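As a sketch of the boosting loop such detectors are built on (threshold stumps on a single feature stand in for the Haar-feature weak learners of a real face detector; the function names are ours, and this is a didactic simplification, not the paper's system):

```python
import math

def stump_predict(x, threshold, polarity):
    """Weak learner: +1 on one side of a threshold, -1 on the other."""
    return 1 if polarity * x > polarity * threshold else -1

def train_adaboost(X, y, rounds=3):
    """AdaBoost: at each round, pick the stump with the lowest weighted
    error, weight it by alpha, and up-weight the examples it got wrong."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        best = None
        for thr in sorted(set(X)):
            for pol in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if stump_predict(xi, thr, pol) != yi)
                if best is None or err < best[0]:
                    best = (err, thr, pol)
        err, thr, pol = best
        err = max(err, 1e-10)  # avoid log(0) on a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, thr, pol))
        w = [wi * math.exp(-alpha * yi * stump_predict(xi, thr, pol))
             for xi, yi, wi in zip(X, y, w)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict(ensemble, x):
    """Sign of the alpha-weighted vote of the stumps."""
    score = sum(a * stump_predict(x, t, p) for a, t, p in ensemble)
    return 1 if score >= 0 else -1
```

In a cascade-style face detector the same reweighting loop runs over thousands of Haar features per image window, which is what lets the system discard background regions cheaply and concentrate computation on face-like regions.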
Procedia PDF Downloads 157
14287 The Play Translator’s Score Developing: Methodology for Intercultural Communication
Authors: Akhmylovskaia Larisa, Barysh Andriana
Abstract:
The present paper introduces the translation score developing methodology and methods in cross-cultural communication. The ideas and examples presented by the authors illustrate the universal character of the translation score developing methods under analysis. Personal experience in international theatre-making projects, opera laboratories, cross-cultural master-classes, and movie and theatre festivals provides opportunities to single out the conditions, forms, means and principles of translation score developing, as well as the translator/interpreter's functions as a cultural liaison for multiethnic collaboration.
Keywords: methodology of translation score developing, pre-production, analysis, production, post-production, ethnic scene theory, theatre anthropology, laboratory, master-class, educational project, academic project, Stanislavski terminology meta-language, super-objective, participant observation
Procedia PDF Downloads 325
14286 Integration of Agile Philosophy and Scrum Framework to Missile System Design Processes
Authors: Misra Ayse Adsiz, Selim Selvi
Abstract:
In today's world, technology is competing with time. In order to catch up with the world's companies and adapt quickly to change, it is necessary to speed up processes and keep pace with the rate of technological change. Missile system design processes handled with classical methods fall behind in this race, because customer requirements are unclear and demands change again and again during the design process. Therefore, a methodology suitable for the dynamics of missile system design has been investigated, and the processes used to keep up with the era are examined. When commonly used design processes are analyzed, it is seen that none of them is dynamic enough for today's conditions, so a hybrid design process is established. After a detailed review of the existing processes, it was decided to focus on the Scrum framework and the agile philosophy. Scrum is a process framework focused on developing software and handling change management with rapid methods; the agile philosophy, in turn, is intended to respond quickly to change. In this study, the aim is to integrate the Scrum framework and the agile philosophy, the most appropriate approaches for rapid production and adaptation to change, into the missile system design process. With this approach, the design team involved in the system design process stays in communication with the customer and takes an iterative approach to change management. These methods, currently used in the software industry, have been integrated with the product design process. A team is created for the system design process, and the Scrum roles are realized with the customer included: a Scrum team consists of the product owner, the development team and the Scrum master. Scrum events, which are short, purposeful and time-limited, are organized to serve coordination rather than long meetings.
Instead of the classic system design methods used in product development studies, a missile design is carried out with this blended method. With the help of this design approach, it becomes easier to anticipate changing customer demands, produce quick solutions to those demands and combat uncertainties in the product development process. With the feedback of the customer included in the process, the work proceeds towards marketing, design and financial optimization.
Keywords: agile, design, missile, scrum
Procedia PDF Downloads 168
14285 Assessment of Sustainability Initiatives at Applied Science University in Bahrain
Authors: Bayan Ahmed Alsaffar
Abstract:
The aim of this study is to assess the sustainability initiatives at Applied Sciences University (ASU) in Bahrain using a mixed-methods approach based on the perceptions of students, staff and faculty. The study involves a literature review, interviews with faculty members and students, and a survey of ASU's level of sustainability in education, research, operations, administration and finance based on the Sustainability Tracking, Assessment & Rating System (STARS), a tool used to evaluate the sustainability performance of higher education institutions. The study concludes that a mixed-methods approach can provide a powerful tool for assessing sustainability initiatives at ASU and ultimately lead to insights that can inform effective strategies for improving sustainability efforts. The current study contributes to the field of sustainability in universities and highlights the importance of user engagement and awareness for achieving sustainability goals.
Keywords: environment, initiatives, society, sustainability, STARS, university
Procedia PDF Downloads 93
14284 Management and Agreement Protocol in Computer Security
Authors: Abdulameer K. Hussain
Abstract:
When dealing with a cryptographic system, we note that many activities are performed by the parties of the system, the most prominent being the process of agreement between the parties on how to perform the cryptographic system's tasks so that the system is more secure, more trusted and more reliable. The most common agreement among parties is a key agreement, alongside other types of agreements. Although there have been attempts from some quarters to find other effective agreement methods, these methods are limited to the traditional agreements. This paper presents different parameters for performing the task of agreement more effectively, including the key alternative, the agreement on the encryption method used, and the agreement to prevent denial of service. To manage and achieve these goals, the method proposes a control and monitoring entity that manages these agreements by collecting statistical information on the opinions of the authorized parties in the cryptographic system. These statistics help the entity make proper decisions about the agreement factors. This entity is called the Agreement Manager (AM).
Keywords: agreement parameters, key agreement, key exchange, security management
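A key agreement itself is classically realized with the Diffie-Hellman exchange; the toy sketch below illustrates only that primitive, not the paper's AM protocol. The modulus is a deliberately small, illustrative Mersenne prime, not a vetted group; deployed systems use standardized 2048-bit-plus groups or elliptic-curve equivalents:

```python
import secrets

# Toy parameters for illustration only.
P = 2 ** 127 - 1   # a Mersenne prime modulus (far too small for real use)
G = 3              # public generator

def dh_keypair():
    """Return (private exponent, public value G^priv mod P)."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def dh_shared_secret(priv, other_pub):
    """Both parties compute G^(a*b) mod P from their own private key
    and the other party's public value."""
    return pow(other_pub, priv, P)

a_priv, a_pub = dh_keypair()   # party A
b_priv, b_pub = dh_keypair()   # party B
```

An Agreement Manager of the kind the paper proposes would sit above such a primitive, recording which key-establishment method, encryption algorithm and anti-denial-of-service measures the parties have agreed to use.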
Procedia PDF Downloads 421