Search results for: Extended Park's vector approach

15171 Radar Fault Diagnosis Strategy Based on Deep Learning

Authors: Bin Feng, Zhulin Zong

Abstract:

Radar systems are critical to modern military, aviation, and maritime operations, and their proper functioning is essential for the success of these operations. However, due to the complexity and sensitivity of radar systems, they are susceptible to various faults that can significantly affect their performance. Traditional radar fault diagnosis strategies rely on expert knowledge and rule-based approaches, which are often limited in effectiveness and require considerable time and resources. Deep learning has recently emerged as a promising approach for fault diagnosis due to its ability to learn features and patterns automatically from large amounts of data. In this paper, we propose a radar fault diagnosis strategy based on deep learning that can accurately identify and classify faults in radar systems. Our approach uses convolutional neural networks (CNNs) to extract features from radar signals and to classify faults from those features. The proposed strategy is trained and validated on a dataset of measured radar signals with various types of faults, and the results show that it achieves high accuracy in fault diagnosis. To further evaluate its effectiveness, we compare it with traditional rule-based approaches and other machine learning methods, including decision trees, support vector machines (SVMs), and random forests. The results demonstrate that our deep learning-based approach outperforms these alternatives in terms of accuracy and efficiency. Finally, we discuss the potential applications and limitations of the proposed strategy, as well as future research directions. Our study highlights the importance and potential of deep learning for radar fault diagnosis and suggests that it can be a valuable tool for improving the performance and reliability of radar systems, with significant potential for practical applications and further research.
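
As a rough illustration of the kind of architecture the abstract describes, the sketch below builds a small 1-D CNN that extracts features from windowed radar signals and classifies them into fault types; the window length, class count, and layer sizes are assumptions, not the authors' configuration.

```python
# Minimal sketch, assuming windowed 1-D radar signals (length 1024) labeled
# with one of, say, 5 fault classes; architecture details are illustrative.
import numpy as np
import tensorflow as tf

NUM_CLASSES = 5     # hypothetical number of fault types
SIGNAL_LEN = 1024   # hypothetical samples per radar signal window

model = tf.keras.Sequential([
    tf.keras.layers.Conv1D(32, kernel_size=9, activation="relu",
                           input_shape=(SIGNAL_LEN, 1)),   # feature extraction
    tf.keras.layers.MaxPooling1D(4),
    tf.keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),  # fault classification
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Dummy stand-in data; replace with measured radar signals and fault labels.
X = np.random.randn(256, SIGNAL_LEN, 1).astype("float32")
y = np.random.randint(0, NUM_CLASSES, size=256)
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```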

Keywords: radar system, fault diagnosis, deep learning, radar fault

Procedia PDF Downloads 74
15170 Comparison of Extended Kalman Filter and Unscented Kalman Filter for Autonomous Orbit Determination of Lagrangian Navigation Constellation

Authors: Youtao Gao, Bingyu Jin, Tanran Zhao, Bo Xu

Abstract:

The history of satellite navigation dates back to the 1960s. From the U.S. Transit system and the Russian Tsikada system to the modern Global Positioning System (GPS) and the Globalnaya Navigatsionnaya Sputnikovaya Sistema (GLONASS), the performance of satellite navigation has improved greatly. Nowadays, the navigation accuracy and coverage of these existing systems fully satisfy the requirements of near-Earth users, but deep-space targets remain beyond their reach. Due to the renewed interest in space exploration, a novel high-precision satellite navigation system is becoming ever more important. The increasing demand for such a deep-space navigation system has contributed to the emergence of a variety of new constellation architectures, such as the Lunar Global Positioning System. Apart from a Walker constellation similar to the one adopted by GPS on Earth, a novel constellation architecture consisting of libration point satellites in the Earth-Moon system is also available for constructing a lunar navigation system, which may accordingly be called the libration point satellite navigation system. The concept of using Earth-Moon libration point satellites for lunar navigation was first proposed by Farquhar and has since been pursued by many other researchers. Moreover, due to the special characteristics of libration point orbits, an autonomous orbit determination technique called 'LiAISON navigation' can be adopted by the libration point satellites: using only scalar satellite-to-satellite tracking data, the orbits of both the user and the libration point satellites can be determined autonomously. In this way, extensive Earth-based tracking measurements can be eliminated, and an autonomous satellite navigation system can be developed for future space exploration missions. The choice of state estimator is a non-negligible factor affecting orbit determination accuracy, alongside the orbit type, initial state accuracy, and measurement accuracy. We apply the extended Kalman filter (EKF) and the unscented Kalman filter (UKF) to determine the orbits of Lagrangian navigation satellites, and we compare the resulting autonomous orbit determination errors. The simulation results illustrate that the UKF improves the accuracy and z-axis convergence to some extent.
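
To make the filtering setup concrete, here is a minimal UKF sketch for scalar range-only satellite-to-satellite tracking, assuming the filterpy library; the placeholder constant-velocity dynamics stand in for the real circular restricted three-body (CR3BP) propagation such a study would use, and all numbers are invented.

```python
# Minimal range-only orbit-estimation sketch with a UKF (filterpy assumed).
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

DT = 60.0                                   # time step [s], assumed
BEACON = np.array([3.0e5, 1.0e5, 2.0e4])    # libration-point satellite position [km], hypothetical

def fx(x, dt):
    """Placeholder dynamics: position advanced by velocity.
    A real implementation would integrate the CR3BP equations of motion."""
    out = x.copy()
    out[:3] += x[3:] * dt
    return out

def hx(x):
    """Scalar satellite-to-satellite range measurement."""
    return np.array([np.linalg.norm(x[:3] - BEACON)])

points = MerweScaledSigmaPoints(n=6, alpha=1e-3, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=6, dim_z=1, dt=DT, fx=fx, hx=hx, points=points)
ukf.x = np.array([3.2e5, 0.9e5, 1.0e4, 0.1, -0.2, 0.05])  # initial state guess
ukf.P *= 100.0                                            # initial uncertainty
ukf.R *= 0.01                                             # range noise [km^2]
ukf.Q *= 1e-6                                             # process noise

for z in (2.45e4, 2.44e4, 2.46e4):          # fake range measurements [km]
    ukf.predict()
    ukf.update([z])
print(ukf.x)
```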

Keywords: extended Kalman filter, autonomous orbit determination, unscented Kalman filter, navigation constellation

Procedia PDF Downloads 270
15169 Small Target Recognition Based on Trajectory Information

Authors: Saad Alkentar, Abdulkareem Assalem

Abstract:

Recognizing small targets has always posed a significant challenge in image analysis. Over long distances, the image signal-to-noise ratio tends to be low, limiting the amount of useful information available to detection systems. Consequently, visual target recognition becomes an intricate task to tackle. In this study, we introduce a Track Before Detect (TBD) approach that leverages target trajectory information (coordinates) to effectively distinguish between noise and potential targets. By reframing the problem as a multivariate time series classification, we have achieved remarkable results. Specifically, our TBD method achieves an impressive 97% accuracy in separating target signals from noise within a mere half-second time span (consisting of 10 data points). Furthermore, when classifying the identified targets into our predefined categories—airplane, drone, and bird—we achieve an outstanding classification accuracy of 96% over a more extended period of 1.5 seconds (comprising 30 data points).
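
A minimal sketch of the two classification stages, treating each candidate trajectory as a flattened multivariate time series; the random-forest classifier, window shapes, and data below are illustrative stand-ins, since the abstract does not name the classifier used.

```python
# Stage 1 of the TBD idea on toy data: 10-point (x, y) windows -> noise/target.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
windows = rng.normal(size=(500, 10, 2))     # 500 candidate tracks, 10 (x, y) points each
labels = rng.integers(0, 2, size=500)       # 0 = noise, 1 = target (toy labels)

X = windows.reshape(len(windows), -1)       # flatten the multivariate time series
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)

stage1 = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("noise-vs-target accuracy:", accuracy_score(y_te, stage1.predict(X_te)))

# Stage 2 would reuse the same recipe on 30-point windows with labels
# {airplane, drone, bird} for the windows that stage 1 flags as targets.
```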

Keywords: small targets, drones, trajectory information, TBD, multivariate time series

Procedia PDF Downloads 32
15168 Facility Anomaly Detection with Gaussian Mixture Model

Authors: Sunghoon Park, Hank Kim, Jinwon An, Sungzoon Cho

Abstract:

The Internet of Things allows one to collect data from facilities, which can then be used to monitor them and even predict malfunctions in advance. Conventional quality control methods focus on setting a normal range for a sensor value, defined by a lower control limit and an upper control limit, and declaring as an anomaly anything falling outside it. However, interactions among sensor values are ignored, leading to suboptimal performance. We propose a multivariate approach that takes many sensor values into account at the same time. In particular, a Gaussian Mixture Model is used, trained to maximize the likelihood using the Expectation-Maximization algorithm. The number of Gaussian component distributions is determined by the Bayesian Information Criterion, and the negative log-likelihood is used as the anomaly score. The usage scenario is as follows: for each instance of sensor values from a facility, an anomaly score is computed; if it is larger than a threshold, an alarm goes off and a human expert intervenes and checks the system. Real-world data from a building energy system were used to test the model.
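
The scoring pipeline described above can be sketched directly with scikit-learn: fit GMMs of increasing size, keep the one with the lowest BIC, use the negative log-likelihood as the anomaly score, and alarm above a threshold. The sensor count and the percentile threshold below are assumptions.

```python
# Minimal GMM anomaly-scoring sketch on stand-in multivariate sensor data.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
X_train = rng.normal(size=(1000, 8))            # 8 sensors, normal operation

# Select the number of components with BIC.
best_gmm, best_bic = None, np.inf
for k in range(1, 8):
    gmm = GaussianMixture(n_components=k, random_state=0).fit(X_train)
    bic = gmm.bic(X_train)
    if bic < best_bic:
        best_gmm, best_bic = gmm, bic

# Negative log-likelihood as anomaly score; alarm above the 99.5th percentile.
train_scores = -best_gmm.score_samples(X_train)
threshold = np.percentile(train_scores, 99.5)

x_new = rng.normal(size=(1, 8)) + 4.0           # a suspicious reading
if -best_gmm.score_samples(x_new)[0] > threshold:
    print("alarm: dispatch a human expert")
```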

Keywords: facility anomaly detection, gaussian mixture model, anomaly score, expectation maximization algorithm

Procedia PDF Downloads 262
15167 A Comparative Study on ANN, ANFIS and SVM Methods for Computing Resonant Frequency of A-Shaped Compact Microstrip Antennas

Authors: Ahmet Kayabasi, Ali Akdagli

Abstract:

In this study, three robust predictive methods, namely the artificial neural network (ANN), the adaptive neuro-fuzzy inference system (ANFIS) and the support vector machine (SVM), were used for computing the resonant frequency of A-shaped compact microstrip antennas (ACMAs) operating in the UHF band. Firstly, the resonant frequencies of 144 ACMAs with various dimensions and electrical parameters were simulated with the help of IE3D™, which is based on the method of moments (MoM). The ANN, ANFIS and SVM models for computing the resonant frequency were then built from the simulation data: 124 simulated ACMAs were used for training and the remaining 20 for testing. The performance of the ANN, ANFIS and SVM models was compared in the training and test processes. The average percentage errors (APE) of the computed resonant frequencies in training were 0.457% for the ANN, 0.399% for ANFIS and 0.600% for the SVM. The constructed models were then tested, and APE values of 0.601% for the ANN, 0.744% for ANFIS and 0.623% for the SVM were achieved. These results show that ANN, ANFIS and SVM methods can be successfully applied to compute the resonant frequency of ACMAs, since they are useful and versatile methods that yield accurate results.
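
A minimal sketch of the train/test comparison using the paper's APE metric; SVR and an MLP stand in for the SVM and ANN models (ANFIS has no standard scikit-learn implementation), and the data are synthetic stand-ins for the IE3D simulations.

```python
# APE-based comparison sketch with the 124/20 split from the abstract.
import numpy as np
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def ape(y_true, y_pred):
    """Average percentage error, as used in the paper."""
    return 100.0 * np.mean(np.abs(y_true - y_pred) / y_true)

rng = np.random.default_rng(2)
X = rng.uniform(size=(144, 6))                 # geometry + electrical parameters
y = 0.3 + 2.0 * X[:, 0] + X[:, 1] ** 2         # stand-in for MoM frequencies [GHz]
X_tr, y_tr, X_te, y_te = X[:124], y[:124], X[124:], y[124:]

models = {
    "ANN": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000,
                                      random_state=0)),
    "SVM": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0)),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name,
          "train APE: %.3f%%" % ape(y_tr, model.predict(X_tr)),
          "test APE: %.3f%%" % ape(y_te, model.predict(X_te)))
```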

Keywords: a-shaped compact microstrip antenna, artificial neural network (ANN), adaptive neuro-fuzzy inference system (ANFIS), support vector machine (SVM)

Procedia PDF Downloads 431
15166 Adolf Portmann: A Thinker of Self-Expressive Life

Authors: Filip Jaroš

Abstract:

The Swiss scholar Adolf Portmann (1897-1982) was an outstanding figure in the history of biology and the philosophy of the life sciences. Portmann's biological theory is primarily focused on the problem of animal form (Gestalt), and it poses a significant counterpart to neo-Darwinian theories asserting the explanatory primacy of the genetic level over the outer appearance of animals. Besides that, Portmann's morphological studies of species-specific ontogeny and the influence of environmental surroundings can be classified as antecedents of contemporary synthetic approaches such as "eco-evo-devo," the "extended synthesis," or biosemiotics. The most influential of Portmann's concepts up to the present is his thesis of the social womb (sozialer Mutterschoß): human children are born physiologically premature in comparison with other primates, and they find a second womb in a social environment that nurtures their healthy development. It is during the first year of extra-uterine life that a specifically human nature is formed, characterized by the strong tie between an individual and a broader historical, cultural whole. In my paper, I will closely analyze: a) the historical coordinates of Portmann's philosophy of the life sciences (e.g., the philosophical anthropology of A. Gehlen and H. Plessner and their concept of humans as beings "open to the world"), and b) the relation of Portmann's concept of the social womb to contemporary theories of the evolution of infant birth.

Keywords: adolf portmann, extended synthesis, philosophical anthropology, social womb

Procedia PDF Downloads 228
15165 Accurate Cortical Reconstruction in Narrow Sulci with Zero-Non-Zero Distance (ZNZD) Vector Field

Authors: Somojit Saha, Rohit K. Chatterjee, Sarit K. Das, Avijit Kar

Abstract:

A new force field is designed for propagating a parametric contour into deep, narrow cortical folds in the application of knowledge-based reconstruction of the cerebral cortex from MR images of the brain. The design of this force field is strongly inspired by the Generalized Gradient Vector Flow (GGVF) model but differs markedly in how image information is manipulated to determine the direction of contour propagation. While GGVF uses an edge map as its main driving force, the newly designed force field uses the map of distances between zero-valued pixels and their nearest non-zero-valued pixels as its main driving force; hence, it is called the Zero-Non-Zero Distance (ZNZD) force field. The objective of this force field is the forceful propagation of the contour, beyond spurious convergence due to the partial volume effect (PVE), into narrow sulcal folds. Being a function of the corresponding non-zero pixel value, the force field has an inherent ability to determine the spuriousness of an edge automatically. It is effectively applied, along with some morphological processing, in cortical reconstruction to overcome the hindrance of PVE in narrow sulci, where conventional GGVF fails.
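
The core ZNZD quantity is straightforward to compute with SciPy's Euclidean distance transform, as in the sketch below; the toy image and the force-magnitude formula are illustrative assumptions, since the abstract does not give the exact functional form.

```python
# ZNZD driving-force sketch: for every zero pixel, the distance to (and the
# value of) its nearest non-zero pixel. Contour evolution is not reproduced.
import numpy as np
from scipy.ndimage import distance_transform_edt

img = np.zeros((64, 64), dtype=float)
img[20:40, 20:40] = 100.0                      # stand-in cortical tissue

# distance_transform_edt(img == 0) gives, for each zero pixel, the distance
# to the nearest non-zero pixel; return_indices also gives that pixel's index.
dist, (iy, ix) = distance_transform_edt(img == 0, return_indices=True)
nearest_val = img[iy, ix]                      # value of the nearest non-zero pixel

# An assumed force magnitude combining both quantities, so weak (spurious,
# PVE-affected) edges can be discounted automatically.
force_mag = nearest_val / (1.0 + dist)
print(force_mag.max(), dist.max())
```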

Keywords: deformable model, external force field, partial volume effect, cortical reconstruction, MR image of brain

Procedia PDF Downloads 380
15164 Intrusion Detection in Computer Networks Using a Hybrid Model of Firefly and Differential Evolution Algorithms

Authors: Mohammad Besharatloo

Abstract:

Intrusion detection is an important research topic in network security because of the growing use of computer network services. Intrusion detection aims to detect unauthorized use or abuse of networks and systems by intruders; an intrusion detection system is therefore an efficient tool for controlling users' access through predefined regulations. Since the data used in intrusion detection systems are high-dimensional, a proper representation is required to reveal their basic structure, and it is necessary to eliminate redundant features to create the best representative subset. In the proposed method, a hybrid model of the differential evolution and firefly algorithms is employed to choose the best subset of features, while a decision tree and a support vector machine (SVM) are adopted to assess the quality of the selected features. First, the sorted population is divided into two sub-populations, and the two optimization algorithms are applied to these sub-populations, respectively. The sub-populations are then merged to create the population for the next iteration. The performance evaluation of the proposed method is based on the KDD Cup 99 dataset, and the simulation results show that the proposed method outperforms the other methods in this context.
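
A compact skeleton of the hybrid selection loop, assuming binary feature masks scored by SVM cross-validation accuracy; the simplified firefly- and DE-style moves and all parameters are placeholders for the operators of the actual paper.

```python
# Hybrid firefly/DE feature-selection skeleton on KDD Cup 99-style data.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(3)
X = rng.normal(size=(300, 41))                    # 41 features, as in KDD Cup 99
y = rng.integers(0, 2, size=300)                  # normal vs. intrusion (toy labels)

def fitness(mask):
    """Quality of a feature subset = SVM cross-validation accuracy."""
    if not mask.any():
        return 0.0
    return cross_val_score(SVC(), X[:, mask], y, cv=3).mean()

pop = rng.random((10, X.shape[1])) < 0.5          # population of binary masks
for _ in range(5):                                # a few generations
    scores = np.array([fitness(m) for m in pop])
    order = np.argsort(scores)[::-1]
    top, bottom = pop[order[:5]], pop[order[5:]]  # split into two sub-populations
    # Firefly-like move: dim masks copy random bits from the brightest mask.
    fa = np.where(rng.random(bottom.shape) < 0.3, top[0], bottom)
    # DE-like move: recombine bits of two random elite masks.
    idx = rng.integers(0, 5, size=(5, 2))
    de = np.where(rng.random(top.shape) < 0.5, top[idx[:, 0]], top[idx[:, 1]])
    pop = np.vstack([fa, de])                     # merge for the next iteration

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected features:", np.flatnonzero(best))
```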

Keywords: intrusion detection system, differential evolution, firefly algorithm, support vector machine, decision tree

Procedia PDF Downloads 75
15163 Enhancing Sell-In and Sell-Out Forecasting Using Ensemble Machine Learning Method

Authors: Vishal Das, Tianyi Mao, Zhicheng Geng, Carmen Flores, Diego Pelloso, Fang Wang

Abstract:

Accurate sell-in and sell-out forecasting is a ubiquitous problem in the retail industry and an important element of any demand planning activity. As a global food and beverage company, Nestlé has hundreds of products in each geographical location in which it operates. Each product has its own sell-in and sell-out time series, which are forecasted on weekly and monthly scales for demand and financial planning. To address this challenge, Nestlé Chile, in collaboration with the Amazon Machine Learning Solutions Lab, has developed an in-house solution that uses machine learning models for forecasting. Similar products are combined so that there is one model for each product category; in this way, the models learn from a larger set of data, and there are fewer models to maintain. The solution is scalable to all product categories and is designed to be flexible enough to include any new product or eliminate any existing product in a category as required. We show how the machine learning development environment on Amazon Web Services (AWS) can be used to explore a set of forecasting models and create business intelligence dashboards that work with the existing demand planning tools at Nestlé. We explored recent deep neural networks (DNNs), which show promising results for a variety of time series forecasting problems. Specifically, we used DeepAR, an autoregressive model that can group similar time series together and provide robust predictions. To further enhance the accuracy of the predictions and include domain-specific knowledge, we designed an ensemble approach combining DeepAR with an XGBoost regression model. As part of the ensemble, we interlinked the sell-out and sell-in information to ensure that a future sell-out influences the current sell-in predictions. Our approach outperforms the benchmark statistical models by more than 50%. The machine learning (ML) pipeline implemented in the cloud is currently being extended to other product categories and is being adopted by other geomarkets.
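
A minimal sketch of the interlinking idea, assuming DeepAR forecasts are already available as arrays (the GluonTS training loop is omitted): XGBoost consumes lagged sell-in plus the sell-out forecast, and the two members are blended by a simple average, which is an assumption rather than the production rule.

```python
# Interlinked sell-in/sell-out ensemble sketch on invented weekly data.
import numpy as np
from xgboost import XGBRegressor

rng = np.random.default_rng(4)
weeks = 120
sell_in = rng.gamma(2.0, 50.0, size=weeks)
sell_out_forecast = rng.gamma(2.0, 50.0, size=weeks)     # e.g., from a DeepAR model

# Features for week t: sell-in lags 1..4 plus the sell-out forecast for t.
lags = 4
X = np.column_stack([sell_in[lags - k - 1:weeks - k - 1] for k in range(lags)]
                    + [sell_out_forecast[lags:]])
y = sell_in[lags:]

xgb = XGBRegressor(n_estimators=200, max_depth=3).fit(X, y)
deepar_pred = sell_out_forecast[lags:] * 0.9             # stand-in DeepAR sell-in forecast
ensemble = 0.5 * xgb.predict(X) + 0.5 * deepar_pred      # assumed equal-weight blend
print(ensemble[:5])
```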

Keywords: sell-in and sell-out forecasting, demand planning, DeepAR, retail, ensemble machine learning, time-series

Procedia PDF Downloads 239
15162 Online Handwritten Character Recognition for South Indian Scripts Using Support Vector Machines

Authors: Steffy Maria Joseph, Abdu Rahiman V, Abdul Hameed K. M.

Abstract:

Online handwritten character recognition is a challenging field in artificial intelligence. The classification success rate of current techniques decreases when the dataset involves similarity and complexity in stroke styles, in the number of strokes, and in variations of stroke characteristics. Malayalam is a complex South Indian language spoken by about 35 million people, especially in Kerala and the Lakshadweep islands. In this paper, we consider feature extraction suited to the similar stroke styles of Malayalam; the extracted feature set is also suitable for recognizing other handwritten South Indian languages such as Tamil, Telugu and Kannada. A classification scheme based on support vector machines (SVM) is proposed to improve the accuracy of classifying and recognizing online handwritten Malayalam characters, as SVM classifiers perform well in many real-world applications. The contribution of various features to recognition accuracy is analysed, and performance for different SVM kernels is also studied. A graphical user interface has been developed for reading and displaying the characters. Different writing styles were collected for each of the 44 alphabets, and various features were extracted and used for classification after preprocessing of the input data samples. Experimentally, the highest recognition accuracy of 97% is obtained with the best feature combination and a polynomial kernel in the SVM.
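
A minimal sketch of the kernel comparison, assuming each character sample has already been reduced to a fixed-length feature vector; the feature dimension and sample counts are invented, while the 44-class setup follows the abstract.

```python
# SVM kernel comparison sketch for a 44-class character-recognition task.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(5)
X = rng.normal(size=(880, 32))              # 20 samples for each of 44 characters
y = np.repeat(np.arange(44), 20)

for kernel in ("linear", "rbf", "poly"):    # the paper reports poly as best
    clf = SVC(kernel=kernel, degree=3, C=10.0)
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(kernel, "accuracy: %.3f" % acc)
```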

Keywords: SVM, matlab, malayalam, South Indian scripts, online handwritten character recognition

Procedia PDF Downloads 562
15161 A Comparative Study of Optimization Techniques and Models for Forecasting Dengue Fever

Authors: Sudha T., Naveen C.

Abstract:

Dengue is a serious public health issue that imposes significant annual economic and welfare burdens on nations. However, enhanced optimization techniques and quantitative modeling approaches can predict the incidence of dengue; by advocating a data-driven approach, public health officials can make informed decisions and improve the overall effectiveness of outbreak control efforts. This study uses environmental data from two U.S. federal government agencies, the National Oceanic and Atmospheric Administration and the Centers for Disease Control and Prevention. Based on environmental data describing changes in temperature, precipitation, vegetation, and other factors known to affect dengue incidence, several predictive models are constructed that use different machine learning methods to estimate weekly dengue cases. The first step is data preparation, which includes handling outliers and missing values so that the data are ready for subsequent processing and the creation of an accurate forecasting model. In the second phase, multiple feature selection procedures are applied using various machine learning models and optimization techniques. In the third phase, machine learning models such as the Huber Regressor, Support Vector Machine, Gradient Boosting Regressor (GBR), and Support Vector Regressor (SVR) are compared in combination with several optimization techniques for feature selection, such as Harmony Search and the Genetic Algorithm. In the fourth stage, model performance is evaluated using the Mean Square Error (MSE), Mean Absolute Error (MAE), and Root Mean Square Error (RMSE). The goal is to select the optimization strategy with the fewest errors, lowest cost, and greatest predictive potential; optimization is widely employed across industries including engineering, science, management, mathematics, finance, and medicine. An effective optimization method based on harmony search integrated with a genetic algorithm is introduced for input feature selection, and it shows an important improvement in the model's predictive accuracy. The predictive models built on the Huber Regressor perform best for both optimization and prediction.
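
A minimal sketch of the fourth-stage model comparison with the stated error metrics; the data are synthetic, and the harmony search/genetic algorithm feature-selection step is omitted here.

```python
# Model comparison sketch: Huber, SVR, and GBR scored by MAE, MSE, and RMSE.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import HuberRegressor
from sklearn.svm import SVR
from sklearn.metrics import mean_absolute_error, mean_squared_error

rng = np.random.default_rng(6)
X = rng.normal(size=(400, 10))                 # temperature, precipitation, vegetation, ...
y = 20 + 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=2.0, size=400)  # weekly cases (toy)
X_tr, X_te, y_tr, y_te = X[:300], X[300:], y[:300], y[300:]

models = {"Huber": HuberRegressor(), "SVR": SVR(),
          "GBR": GradientBoostingRegressor(random_state=0)}
for name, m in models.items():
    m.fit(X_tr, y_tr)
    pred = m.predict(X_te)
    mse = mean_squared_error(y_te, pred)
    print(name, "MAE %.2f" % mean_absolute_error(y_te, pred),
          "MSE %.2f" % mse, "RMSE %.2f" % np.sqrt(mse))
```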

Keywords: deep learning model, dengue fever, prediction, optimization

Procedia PDF Downloads 45
15160 A Case for Strategic Landscape Infrastructure: South Essex Estuary Park

Authors: Alexandra Steed

Abstract:

Alexandra Steed URBAN was commissioned to undertake the South Essex Green and Blue Infrastructure Study (SEGBI) on behalf of the Association of South Essex Local Authorities (ASELA): a partnership of seven neighboring councils within the Thames Estuary. Located on London’s doorstep, the 70,000-hectare region is under extraordinary pressure for regeneration, further development, and economic expansion, yet faces extreme challenges: sea-level rise and inadequate flood defenses, stormwater flooding and threatened infrastructure, loss of internationally important habitats, significant existing community deprivation, and lack of connectivity and access to green space. The brief was to embrace these challenges in the creation of a document that would form a key part of ASELA’s Joint Strategic Framework and feed into local plans and master plans, thus helping to tackle climate change, ecological collapse, and social inequity at a regional scale whilst creating a relationship and awareness between urban communities and the surrounding landscapes and nature. The SEGBI project applied a ‘land-based’ methodology, combined with a co-design approach involving numerous stakeholders, to explore how living infrastructure can address these significant issues, reshape future planning and development, and create thriving places for the whole community of life. It comprised three key stages: a baseline review; a green and blue infrastructure assessment; and the final green and blue infrastructure report. The resulting proposals frame an ambitious vision for the delivery of a new regional South Essex Estuary (SEE) Park of 24,000 hectares of protected and connected landscapes. This unified parkland system will drive effective place-shaping and “leveling up” for the most deprived communities while providing large-scale nature recovery and biodiversity net gain. Comprehensive analysis and policy recommendations ensure best practices will be embedded within the planning documents and decisions guiding future development. Furthermore, a Natural Capital Account was undertaken as part of the strategy, showing the tremendous economic value of the natural assets. This strategy sets a pioneering precedent that demonstrates how prioritising living infrastructure can address climate change and ecological collapse while also supporting sustainable housing, healthier communities, and resilient infrastructure. It was only achievable through a collaborative and cross-boundary approach to strategic planning and growth, with a shared vision of place and a strong commitment to delivery. With joined-up thinking and a joined-up region, a more impactful plan for South Essex was developed that will lead to numerous environmental, social, and economic benefits across the region, while enhancing the landscape and natural environs on the periphery of one of the largest cities in the world.

Keywords: climate change, green and blue infrastructure, landscape architecture, master planning, regional planning, social equity

Procedia PDF Downloads 86
15159 Towards End-To-End Disease Prediction from Raw Metagenomic Data

Authors: Maxence Queyrel, Edi Prifti, Alexandre Templier, Jean-Daniel Zucker

Abstract:

Analysis of the human microbiome using metagenomic sequencing data has demonstrated a high ability to discriminate among various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to data analysis. Such data contain millions of short sequences read from the fragmented DNA sequences and stored as fastq files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use and time-consuming, and they rely on a large number of parameters that often introduce variability and affect the estimation of the microbiome elements. Training deep neural networks directly on raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most such methods use the concept of word and sentence embeddings, which create a meaningful numerical representation of DNA sequences while extracting features and reducing the dimensionality of the data. In this paper, we present metagenome2vec, an end-to-end approach that classifies patients into disease groups directly from raw metagenomic reads. The approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which a sequence is most likely to come; and (iv) training a multiple instance learning classifier that predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to the influence each genome has on the prediction. Using two public real-life datasets as well as a simulated one, we demonstrate that this original approach reaches high performance, comparable with state-of-the-art methods applied directly on data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that, with further dedication, DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
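
Steps (i)-(ii) can be sketched with gensim's Word2Vec by treating k-mers as words and reads as sentences; k, the vector size, and the toy reads below are assumptions.

```python
# k-mer vocabulary and embedding sketch (steps (i)-(ii) of the pipeline).
import numpy as np
from gensim.models import Word2Vec

def kmers(read, k=4):
    """Tokenize a DNA read into overlapping k-mers (the 'words')."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

reads = ["ACGTACGTGGCT", "TTGACGTACGAA", "GGCTAACGTACG"]   # toy fastq reads
corpus = [kmers(r) for r in reads]                          # each read is a 'sentence'

model = Word2Vec(sentences=corpus, vector_size=64, window=5, min_count=1, sg=1)

# One simple choice of read embedding: the mean of its k-mer vectors.
read_vec = np.mean([model.wv[w] for w in corpus[0]], axis=0)
print(read_vec.shape)
```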

Keywords: deep learning, disease prediction, end-to-end machine learning, metagenomics, multiple instance learning, precision medicine

Procedia PDF Downloads 113
15158 Measuring Financial Asset Return and Volatility Spillovers, with Application to Sovereign Bond, Equity, Foreign Exchange and Commodity Markets

Authors: Petra Palic, Maruska Vizek

Abstract:

We provide an in-depth analysis of the interdependence of asset returns and volatilities in developed and developing countries. The analysis is split into three parts. In the first part, we use a multivariate GARCH model to establish stylized facts on cross-market volatility spillovers. In the second part, we use the generalized vector autoregressive methodology developed by Diebold and Yilmaz (2009) to estimate separate measures of return spillovers and volatility spillovers among sovereign bond, equity, foreign exchange and commodity markets. In particular, our analysis focuses on cross-market return and volatility spillovers in 19 developed and developing countries, estimated from daily data covering 2008 to 2017. In the third part of the analysis, we use a generalized vector autoregressive framework to estimate total and directional volatility spillovers, using the same daily data span for one developed and one developing country in order to characterize daily volatility spillovers across stock, bond, foreign exchange and commodity markets.
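
A sketch of a total spillover index in the spirit of Diebold and Yilmaz, assuming statsmodels; note that statsmodels' fevd() gives the orthogonalized (Cholesky) decomposition, whereas the generalized decomposition of the paper's methodology would have to be computed separately.

```python
# Total spillover index sketch from a VAR forecast-error variance decomposition.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(7)
returns = pd.DataFrame(rng.normal(size=(500, 4)),
                       columns=["bond", "equity", "fx", "commodity"])

res = VAR(returns).fit(maxlags=2)
H = 10                                        # forecast horizon
decomp = res.fevd(H).decomp[:, H - 1, :]      # decomp[i, j]: share of var(i) due to shock j

off_diag = decomp.sum() - np.trace(decomp)    # cross-market contributions only
spillover_index = 100.0 * off_diag / decomp.sum()
print("total spillover index: %.1f%%" % spillover_index)
```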

Keywords: cross-market spillovers, sovereign bond markets, equity markets, vector autoregression (VAR)

Procedia PDF Downloads 250
15157 Hominin Niche in the Times of Climate Change

Authors: Emilia Hunt, Sally C. Reynolds, Fiona Coward, Fabio Parracho Silva, Philip Hopley

Abstract:

Ecological niche modeling is widely used in conservation studies, but its application to extinct hominin species is a relatively new approach. Understanding which ecological niches respective hominin species occupied provides a new perspective on the influences shaping evolutionary processes, and niche separation or overlap can tell us more about the specific requirements of a species within a given timeframe. Many ancestral species lived through enormous climate changes: glacial and interglacial periods and changes in rainfall leading to desertification or flooding of regions; they displayed impressive levels of adaptation necessary for their survival. This paper reviews niche modeling methodologies and their application to hominin studies. Traditional conservation methods might not be directly applicable to extinct species and are not directly transferable to hominins: the hominin niche also includes aspects of technology, the use of fire, and extended communication, which are not traditionally used in building conservation models. Future perspectives on how to improve niche modeling for extinct hominin species will be discussed.

Keywords: hominin niche, climate change, evolution, adaptation, ecological niche modelling

Procedia PDF Downloads 176
15156 The Impact of Step-By-Step Program in the Public Preschool Institutions in Kosova

Authors: Rozafa Shala

Abstract:

The development of preschool education in Kosovo has passed through several periods. The period after the 1999 war was a very intensive one, when preschool education started to change, and the step-by-step program was one of the programs that spread widely in the years after the war and remains present today. The aim of this study is to present the impact of the step-by-step program on preschool education. The research is based on the hypothesis that the step-by-step program continues to be present, through its elements, in all the other programs that teachers may use. For data collection, a questionnaire was constructed and distributed to 25 preschool teachers who work in public preschool institutions, all of whom had completed the training for the step-by-step program. To support the questionnaire data, a focus group was also organized, in which the critical issues of the program were discussed. From the results obtained, we can conclude that the step-by-step program has a very strong impact at the preschool level. Many specific elements, such as circle time, the weather calendar, the classroom environment, and portfolios, are present in most preschool classes, and the teachers' approach also retains many elements of the step-by-step program.

Keywords: preschool education, step-by-step program, impact, teachers

Procedia PDF Downloads 334
15155 Insecticide Resistance Detection on Dengue Vector, Aedes albopictus Obtained from Kapit, Kuching and Sibu Districts in Sarawak State, Malaysia

Authors: Koon Weng Lau, Chee Dhang Chen, Abdul Aziz Azidah, Mohd Sofian-Azirun

Abstract:

Recently, the Sarawak state of Malaysia encountered an outbreak of dengue fever, and Aedes albopictus has been incriminated as one of the important vectors of dengue transmission. Without an effective vaccine, approaches to control or prevent dengue focus on the vectors. The control of Aedes mosquitoes is still dependent on the use of chemical insecticides, and insecticide resistance represents a threat to the effectiveness of vector control. This study was conducted to determine the resistance status of 11 active ingredients representing four major insecticide classes: DDT, dieldrin, malathion, fenitrothion, bendiocarb, propoxur, etofenprox, deltamethrin, lambda-cyhalothrin, cyfluthrin, and permethrin. Standard WHO test procedures were conducted to determine insecticide susceptibility. Aedes albopictus collected from Kapit (resistance ratio, RR = 1.04–3.02), Kuching (RR = 1.17–4.61), and Sibu (RR = 1.06–3.59) exhibited low resistance toward all insecticides except dieldrin. This study revealed that dieldrin is still effective against Ae. albopictus, followed by fenitrothion, cyfluthrin, and deltamethrin. In conclusion, Ae. albopictus in Sarawak exhibited different resistance levels toward various insecticides, and alternative solutions should be implemented to prevent further deterioration of the situation.

Keywords: Aedes albopictus, dengue, insecticide resistance, Malaysia

Procedia PDF Downloads 346
15154 Whether Buffer Zone Community Forests’ Benefits Are Distributed Fairly to Low-Income Users: Reflection From the Buffer Zone Community Forests in Bardia National Park, Nepal

Authors: Keshav Raj Acharya, Thakur Silwal, Neelam C. Poudyal

Abstract:

Buffer zones, the peripheral areas around national parks and wildlife reserves, exist for the purpose of benefitting local inhabitants by providing forest products for their subsistence needs outside the protected areas. The forest area within the buffer zone has been managed as buffer zone community forest (BZCF) for the last 25 years, following the approval of the buffer zone management regulation in 1996. With a case study of selected BZCFs in Bardia National Park, this study analyzes whether the benefits provided by BZCFs are as available to poor users as to other socioeconomic classes of users. The findings are based on the analysis of cross-sectional data involving household surveys (n=305) and key informant interviews (n=10), as well as the office records of five buffer zone community forest user groups. Results indicate that, despite provisions for subsidized rates for the poor, poor households were more deprived owing to higher forest product prices in the buffer zone, particularly timber prices. Evidence also indicates that, with increased forest coverage, the incidence of wildlife damage has risen and has affected the poor more, due to their lack of land ownership and limited alternatives. Clear community forest management guidelines, with equitable benefit sharing and compensatory mechanisms for users of the poor socioeconomic class, are identified as a solution to increase the benefits to poor users in BZCF user groups.

Keywords: crop depredation, forest products, users, wellbeing ranking

Procedia PDF Downloads 36
15153 DNA Prime/MVTT Boost Enhances Broadly Protective Immune Response against Mosaic HIV-1 Gag

Authors: Wan Liu, Haibo Wang, Cathy Huang, Zhiwu Tan, Zhiwei Chen

Abstract:

The tremendous diversity of HIV-1 has been a major challenge for effective AIDS vaccine development. The mosaic approach presents the potential for vaccine design aimed at global protection: a mosaic HIV-1 Gag antigen gives the vaccine-elicited immune response antigenic breadth against a wider spectrum of viral strains. However, the enhancement of the immune response depends on the vaccination strategy used, and heterologous prime/boost regimens have been shown to elicit high levels of immune responses. Here, we investigated whether priming with plasmid DNA delivered by electroporation, followed by boosting with the live replication-competent modified vaccinia virus TianTan vector (MVTT), combined with the mosaic antigen sequence, could elicit a greater and broader antigen-specific response against HIV-1 Gag in mice. Compared to DNA or MVTT alone, or to the MVTT/MVTT group, the DNA/MVTT regimen induced high frequencies of broadly reactive, Gag-specific, polyfunctional, long-lived, and cytotoxic CD8+ T cells, together with increased anti-Gag antibody titers. Meanwhile, the vaccination upregulated PD-1+ and Tim-3+ CD8+ T cells, myeloid-derived suppressor cells, and Treg cells, balancing the stronger immune response induced. Importantly, the prime/boost vaccination helped control EcoHIV and mesothelioma AB1-gag challenges. The stronger protective Gag-specific immunity induced by the mosaic DNA/MVTT vaccine corroborates the promise of the mosaic approach and the potential of these two acceptably safe vectors for enhancing anti-HIV immunity and cancer prevention.

Keywords: DNA/MVTT vaccine, EcoHIV, mosaic antigen, mesothelioma AB1-gag

Procedia PDF Downloads 233
15152 Training Program for Kindergarten Teachers on Learning through Project Approach

Authors: Dian Hartiningsih, Miranda Diponegoro, Evita Eddie Singgih

Abstract:

In facing the 21st century, children need to be prepared to reach their optimum developmental level, which encompasses all aspects of growth, and to achieve learning goals that include not only knowledge and skills but also dispositions and feelings. Teachers, as the forefront of education, need to be equipped with the understanding and skill of a learning method that can prepare children to face this 21st-century challenge. The project approach utilizes active learning, which is beneficial for children. The subjects of this research were kindergarten teachers at Dwi Matra Kindergarten and Kirana Preschool. This is a quantitative study using a before-and-after design. The results suggest that, through a preliminary training program on learning with the project approach, the kindergarten teachers' ability to explain the project approach, including its meaning, benefits and stages, increased significantly, and their ability to design learning with the project approach also improved significantly. The learning designs the teachers produced showed remarkable results for the first stage of the project approach; however, the designs for the second and third stages were not as strong. The challenges faced in the research are elaborated further in the discussion.

Keywords: project approach, teacher training, learning method, kindergarten

Procedia PDF Downloads 322
15151 Supervised Machine Learning Approach for Studying the Effect of Different Joint Sets on Stability of Mine Pit Slopes in the Presence of Different External Factors

Authors: Sudhir Kumar Singh, Debashish Chakravarty

Abstract:

Slope stability analysis is an important aspect of geotechnical engineering. It is also important from a safety and economic point of view, as any slope failure can lead to loss of valuable lives and damage to property worth millions. This paper aims at mitigating the risk of slope failure by studying the effect of different joint sets on the stability of mine pit slopes under the influence of various external factors, namely the degree of saturation, rainfall intensity, and seismic coefficient. A supervised machine learning approach is utilized for making accurate and reliable predictions regarding the stability of slopes based on the value of the factor of safety. Numerous cases were studied by analyzing slope stability with the popular finite element method, and the data thus obtained were used as training data for the supervised machine learning models. The input data were trained on different supervised machine learning models, namely Random Forest, Decision Tree, Support Vector Machine, and XGBoost, and distinct test data not present in the training data were used to measure the performance and accuracy of the models. Although all models performed well on the test dataset, Random Forest stands out from the others due to its high accuracy of greater than 95%, providing a valuable tool that is neither computationally expensive nor time-consuming and is in good accordance with the numerical analysis results.
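
A minimal sketch of the surrogate workflow: FEM results are summarized as rows of joint-set and external-factor inputs with a computed factor of safety, and a random forest learns the stable/unstable label. The synthetic FoS formula and the cutoff of 1.0 are assumptions.

```python
# Random-forest surrogate sketch trained on stand-in FEM slope-stability runs.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(8)
n = 600
X = np.column_stack([
    rng.uniform(0, 90, n),      # joint set dip angle [deg]
    rng.uniform(0, 1, n),       # degree of saturation
    rng.uniform(0, 50, n),      # rainfall intensity [mm/h]
    rng.uniform(0, 0.3, n),     # seismic coefficient
])
fos = (1.5 - 0.004 * X[:, 0] - 0.3 * X[:, 1] - 0.004 * X[:, 2] - 1.0 * X[:, 3]
       + rng.normal(scale=0.05, size=n))     # stand-in for the FEM factor of safety
y = (fos >= 1.0).astype(int)                 # 1 = stable, 0 = unstable (assumed cutoff)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, rf.predict(X_te)))
```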

Keywords: finite element method, geotechnical engineering, machine learning, slope stability

Procedia PDF Downloads 91
15150 Exploring the Synergistic Effects of Aerobic Exercise and Cinnamon Extract on Metabolic Markers in Insulin-Resistant Rats through Advanced Machine Learning and Deep Learning Techniques

Authors: Masoomeh Alsadat Mirshafaei

Abstract:

The present study aims to explore the effect of an 8-week aerobic training regimen combined with cinnamon extract on serum irisin and leptin levels in insulin-resistant rats. Additionally, this research leverages various machine learning (ML) and deep learning (DL) algorithms to model the complex interdependencies between exercise, nutrition, and metabolic markers, offering a groundbreaking approach to obesity and diabetes research. Forty-eight Wistar rats were selected and randomly divided into four groups: control, training, cinnamon, and training-cinnamon. The training protocol was conducted over 8 weeks, with sessions 5 days a week at 75-80% VO2 max. The cinnamon and training-cinnamon groups were injected with 200 ml/kg/day of cinnamon extract. Data analysis covered serum data, dietary intake, exercise intensity, and metabolic response variables, with blood samples collected 72 hours after the final training session. The dataset was analyzed using one-way ANOVA (P<0.05) and fed into various ML and DL models, including Support Vector Machines (SVM), Random Forest (RF), and Convolutional Neural Networks (CNN). Traditional statistical methods indicated that aerobic training, with and without cinnamon extract, significantly increased serum irisin and decreased leptin levels. Among the algorithms, the CNN model provided superior performance in identifying specific interactions between cinnamon extract concentration and exercise intensity, optimizing the increase in irisin and the decrease in leptin. The CNN model achieved an accuracy of 92%, outperforming the SVM (85%) and RF (88%) models in predicting the optimal conditions for metabolic marker improvements. The study demonstrated that advanced ML and DL techniques can uncover nuanced relationships and potential cellular responses to exercise and dietary supplements that are not evident through traditional methods. These findings advocate for the integration of advanced analytical techniques in nutritional science and exercise physiology, paving the way for personalized health interventions in managing obesity and diabetes.

Keywords: aerobic training, cinnamon extract, insulin resistance, irisin, leptin, convolutional neural networks, exercise physiology, support vector machines, random forest

Procedia PDF Downloads 14
15149 Hyper-Tuned RBF SVM: An Approach for the Prediction of Breast Cancer

Authors: Surita Maini, Sanjay Dhanka

Abstract:

Machine learning (ML) involves developing algorithms and statistical models that enable computers to learn and make predictions or decisions based on data without being explicitly programmed. Because of these broad capabilities, ML is gaining popularity in the medical sector: medical imaging, electronic health records, genomic data analysis, wearable devices, disease outbreak prediction, disease diagnosis, etc. In the last few decades, many researchers have tried to diagnose breast cancer (BC) using ML, because early detection of any disease can save millions of lives. Working in this direction, the authors propose a hybrid ML technique, an RBF SVM, to predict BC at an earlier stage. The proposed method is implemented on the breast cancer UCI ML dataset, with 569 instances and 32 attributes. The authors recorded the following performance metrics for the proposed model: accuracy 98.24%, sensitivity 98.67%, specificity 97.43%, F1 score 98.67%, precision 98.67%, and run time 0.044769 seconds. The proposed method is validated by k-fold cross-validation.
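
A minimal sketch of the hyper-tuned RBF SVM, assuming the WDBC dataset bundled with scikit-learn (569 instances, as in the abstract; scikit-learn stores 30 numeric features of the 32 attributes); the grid values are illustrative.

```python
# RBF SVM with grid-searched C and gamma, validated by k-fold cross-validation.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
grid = GridSearchCV(pipe,
                    {"svc__C": [0.1, 1, 10, 100],
                     "svc__gamma": ["scale", 0.01, 0.001]},
                    cv=5, scoring="accuracy").fit(X, y)
print("best params:", grid.best_params_)

# K-fold validation of the tuned model, as in the paper.
scores = cross_val_score(grid.best_estimator_, X, y, cv=10)
print("10-fold accuracy: %.4f +/- %.4f" % (scores.mean(), scores.std()))
```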

Keywords: breast cancer, support vector classifier, machine learning, hyperparameter tuning

Procedia PDF Downloads 57
15148 Wind Power Forecasting Using Echo State Networks Optimized by Big Bang-Big Crunch Algorithm

Authors: Amir Hossein Hejazi, Nima Amjady

Abstract:

In recent years, environmental concerns have driven the replacement of traditional energy sources by renewable ones. Wind energy, the fastest-growing renewable source, accounts for a considerable share of electricity markets. With this fast worldwide growth of wind energy, owners and operators of wind farms, transmission system operators, and energy traders need reliable and secure forecasts of wind energy production. In this paper, a new forecasting strategy is proposed for short-term wind power prediction based on echo state networks (ESN). The forecast engine utilizes a state-of-the-art training process, including a dynamical reservoir with a high capability to learn the complex dynamics of wind power or wind vector signals. The study is made more interesting by incorporating the prediction of wind direction into the forecast strategy. The Big Bang-Big Crunch (BB-BC) evolutionary optimization algorithm is adopted for adjusting the free parameters of the ESN-based forecaster. The proposed method is tested on real-world hourly data to show the efficiency of the forecasting engine for predicting both the wind vector and the wind power output of aggregated wind power production.
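
A minimal echo state network sketch with a fixed random reservoir and a ridge readout; the hyperparameters fixed below (reservoir size, spectral radius, leak rate) are exactly the kind of free parameters the paper's BB-BC optimizer would tune, and the sine 'wind' signal is a stand-in for real hourly data.

```python
# Minimal leaky ESN: random reservoir, ridge-regression readout.
import numpy as np

rng = np.random.default_rng(9)
n_res, leak, rho = 200, 0.3, 0.9              # free parameters (BB-BC would tune these)

u = np.sin(np.linspace(0, 60, 1000)) + 0.1 * rng.normal(size=1000)  # toy wind signal

W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.normal(size=(n_res, n_res))
W *= rho / np.max(np.abs(np.linalg.eigvals(W)))   # scale to the target spectral radius

states = np.zeros((len(u), n_res))
x = np.zeros(n_res)
for t in range(len(u) - 1):
    x = (1 - leak) * x + leak * np.tanh(W_in * u[t] + W @ x)
    states[t + 1] = x                             # state has seen u[0..t]

# One-step-ahead readout: predict u[t] from the state built on u[0..t-1].
lam = 1e-6
A, b = states[100:], u[100:]                      # drop a 100-step washout
W_out = np.linalg.solve(A.T @ A + lam * np.eye(n_res), A.T @ b)
print("train RMSE:", np.sqrt(np.mean((A @ W_out - b) ** 2)))
```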

Keywords: wind power forecasting, echo state network, big bang-big crunch, evolutionary optimization algorithm

Procedia PDF Downloads 556
15147 A 0-1 Goal Programming Approach to Optimize the Layout of Hospital Units: A Case Study in an Emergency Department in Seoul

Authors: Farhood Rismanchian, Seong Hyeon Park, Young Hoon Lee

Abstract:

This paper proposes a method to optimize the layout of an emergency department (ED) based on real executions of care processes, considering several planning objectives simultaneously. Demand for healthcare services has increased dramatically in recent years, and as it increases, so does the need for new healthcare buildings as well as for redesigning and renovating existing ones. The importance of implementing a standard set of engineering facility planning and design techniques has already been proven in both manufacturing and service industries, with many significant functional efficiencies. However, the high complexity of care processes remains a major challenge to applying these methods in healthcare environments. Process mining techniques were applied in this study to tackle this complexity and to enhance care process analysis: process-related information, such as clinical pathways, was extracted from the information system of an ED. A 0-1 goal programming approach is then proposed to find a single layout that simultaneously satisfies several goals. The proposed model was solved with the optimization software CPLEX 12. The solution reached using the proposed method yields a 42.2% improvement in the walking distance of normal patients and a 47.6% improvement in the walking distance of critical patients, at minimum relocation cost. It has been observed that many patients must walk unnecessarily long distances during their visit to the emergency department because of an inefficient design; a carefully designed layout can significantly decrease patient walking distance and the related complications.
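
A toy sketch of the 0-1 goal-programming formulation, assuming the PuLP modeling library instead of CPLEX: binary assignment variables place units at locations, and weighted deviation variables absorb overshoot of the walking-distance goals for normal and critical patients. All data below are invented.

```python
# 0-1 goal programming sketch: two units, two locations, two distance goals.
import pulp

units, locs = ["triage", "imaging"], ["A", "B"]
dist_normal = {("triage", "A"): 10, ("triage", "B"): 30,
               ("imaging", "A"): 25, ("imaging", "B"): 15}
dist_critical = {("triage", "A"): 5, ("triage", "B"): 20,
                 ("imaging", "A"): 18, ("imaging", "B"): 8}

prob = pulp.LpProblem("ed_layout", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (units, locs), cat="Binary")
d_norm = pulp.LpVariable("dev_normal", lowBound=0)    # overshoot of goal 1
d_crit = pulp.LpVariable("dev_critical", lowBound=0)  # overshoot of goal 2

for u in units:                                   # each unit gets one location
    prob += pulp.lpSum(x[u][l] for l in locs) == 1
for l in locs:                                    # each location holds one unit
    prob += pulp.lpSum(x[u][l] for u in units) == 1

goal_normal, goal_critical = 30, 15               # target total walking distances
prob += pulp.lpSum(dist_normal[u, l] * x[u][l]
                   for u in units for l in locs) - d_norm <= goal_normal
prob += pulp.lpSum(dist_critical[u, l] * x[u][l]
                   for u in units for l in locs) - d_crit <= goal_critical
prob += 1.0 * d_norm + 2.0 * d_crit               # weight critical patients higher

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print({(u, l): int(x[u][l].value()) for u in units for l in locs})
```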

Keywords: healthcare operation management, goal programming, facility layout problem, process mining, clinical processes

Procedia PDF Downloads 278
15146 Machine Learning Model to Predict TB Bacteria-Resistant Drugs from TB Isolates

Authors: Rosa Tsegaye Aga, Xuan Jiang, Pavel Vazquez Faci, Siqing Liu, Simon Rayner, Endalkachew Alemu, Markos Abebe

Abstract:

Tuberculosis (TB) is a major cause of disease globally. In most cases, TB is treatable and curable, but only with the proper treatment. Drug-resistant TB occurs when the bacteria become resistant to the drugs used to treat TB. Current strategies to identify drug-resistant TB bacteria are laboratory-based, and identifying the drug-resistant bacteria and treating the patient accordingly takes a long time. Machine learning (ML) and data science approaches can offer new solutions to the problem. In this study, we propose to develop an ML-based model that predicts the antibiotic resistance phenotypes of TB isolates in minutes, so that the right treatment can be given to the patient immediately. The study uses whole-genome sequences (WGS) of TB isolates, extracted from the NCBI repository, as training data. Samples from different countries were included so that the models generalize over a large group of TB isolates from different regions of the world; this lets the models learn the varied behaviors of the TB bacteria and makes them robust. Model training considers three pieces of information extracted from the WGS data: all variants found within the candidate genes (F1), predetermined resistance-associated variants (F2), and the resistance-associated gene information for the particular drug. Two major datasets were constructed from this information: F1 and F2 were treated as two independent datasets, and the third piece of information was used as the class label for both. Five machine learning algorithms were considered to train the models: Support Vector Machine (SVM), Random Forest (RF), Logistic Regression (LR), Gradient Boosting, and AdaBoost. The models were trained on the datasets F1, F2, and F1F2, i.e., the F1 and F2 datasets merged. Additionally, an ensemble approach was used: the F1 and F2 datasets were run through the gradient boosting algorithm, its outputs were combined into a single dataset, called the F1F2 ensemble dataset, and all five algorithms were trained on this dataset. As the experiments show, the ensemble model built on gradient boosting outperformed the rest of the models. In conclusion, this study suggests the ensemble approach, that is, the RF + gradient boosting model, for predicting the antibiotic resistance phenotypes of TB isolates.
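
A minimal sketch of the ensemble described above, assuming binary variant matrices F1 and F2 for the same isolates; out-of-fold gradient-boosting probabilities form the "F1F2 ensemble" features for the random-forest meta-learner, where the cross-fitting is a detail added here to avoid leakage.

```python
# Stacking sketch: gradient boosting on F1 and F2, random forest on top.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_predict, cross_val_score

rng = np.random.default_rng(10)
n = 300
F1 = rng.integers(0, 2, size=(n, 120))   # all variants in candidate genes
F2 = rng.integers(0, 2, size=(n, 40))    # predetermined resistance-associated variants
y = rng.integers(0, 2, size=n)           # resistant vs. susceptible (toy labels)

gb = GradientBoostingClassifier(random_state=0)
p1 = cross_val_predict(gb, F1, y, cv=5, method="predict_proba")[:, 1]
p2 = cross_val_predict(gb, F2, y, cv=5, method="predict_proba")[:, 1]
meta_X = np.column_stack([p1, p2])       # the "F1F2 ensemble" dataset

rf = RandomForestClassifier(n_estimators=300, random_state=0)
print("RF + gradient boosting accuracy: %.3f"
      % cross_val_score(rf, meta_X, y, cv=5).mean())
```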

Keywords: machine learning, MTB, WGS, drug resistant TB

Procedia PDF Downloads 36
15145 Regeneration of Geological Models Using Support Vector Machine Assisted by Principal Component Analysis

Authors: H. Jung, N. Kim, B. Kang, J. Choe

Abstract:

History matching is a crucial procedure for predicting reservoir performance and making future decisions. However, it is difficult due to the uncertainties of initial reservoir models; reliable initial models are therefore important for successful history matching of highly heterogeneous reservoirs such as channel reservoirs. In this paper, we propose a novel scheme for regenerating geological models using a support vector machine (SVM) and principal component analysis (PCA). First, we perform PCA to identify the main geological characteristics of the models: the permeability values of each model are transformed into new parameters along the principal components with eigenvalues of large magnitude. Secondly, the parameters are projected onto a two-dimensional plane by multi-dimensional scaling (MDS) based on Euclidean distances. Finally, we train an SVM classifier using the 20% of models whose well oil production rates (WOPR) are the most similar or the most dissimilar to the true values (10% each). The other 80% of models are then classified by the trained SVM, and we select the models on the low-WOPR-error side. One hundred channel reservoir models are initially generated by single normal equation simulation. By repeating the classification process, we can select models that share the geological trend of the true reservoir model, and the average field of the selected models is used as a probability map for regeneration. Newly generated models preserve correct channel features and exclude wrong geological properties while maintaining suitable uncertainty ranges. History matching with the initial models cannot provide trustworthy results, as it fails to find the correct geological features of the true model. However, history matching with the regenerated ensemble offers reliable characterization by identifying the proper channel trend, and it gives dependable predictions of future performance with reduced uncertainty. In short, we propose a novel classification scheme that integrates PCA, MDS, and SVM for regenerating reservoir models; the scheme easily sorts out reliable models that share the channel trend of the reference in a lower-dimensional space.
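
A minimal sketch of the PCA-MDS-SVM screening loop on stand-in data; the grid size, component counts, and WOPR-error labels are assumptions, while the 100-model ensemble and the 20%/80% split follow the abstract.

```python
# PCA -> MDS -> SVM screening sketch for channel-reservoir model selection.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import MDS
from sklearn.svm import SVC

rng = np.random.default_rng(11)
perm = rng.lognormal(size=(100, 50 * 50))        # 100 channel models, 50x50 grids
wopr_error = rng.uniform(size=100)               # |WOPR - true| per model, stand-in

scores = PCA(n_components=10).fit_transform(perm)         # main geological features
coords = MDS(n_components=2, random_state=0).fit_transform(scores)

# Label the 10 most similar (0) and 10 most dissimilar (1) models; train on them.
order = np.argsort(wopr_error)
idx = np.r_[order[:10], order[-10:]]
labels = np.r_[np.zeros(10), np.ones(10)]
svm = SVC(kernel="rbf").fit(coords[idx], labels)

# Classify the remaining 80 models; keep those on the low-error side.
rest = order[10:-10]
selected = rest[svm.predict(coords[rest]) == 0]
print(len(selected), "models kept for regeneration")
```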

Keywords: history matching, principal component analysis, reservoir modelling, support vector machine

Procedia PDF Downloads 148
15144 A Look at the Quantum Theory of Atoms in Molecules from the Discrete Morse Theory

Authors: Dairo Jose Hernandez Paez

Abstract:

The quantum theory of atoms in molecules (QTAIM) allows us to obtain topological information on the electron density of quantum mechanical systems. QTAIM starts by considering the electron density as a continuous mathematical object. A discretization of the electron density, however, is also a mathematical object, one which, from the standpoint of discrete mathematics, allows a new approach to its topological study. From this point of view, it is necessary to develop a series of steps that provide the theoretical support guaranteeing its application. The steps we consider most important are the following: (1) obtain good representations of the electron density through computational calculations; (2) design a methodology for the discretization of the electron density and construct the simplicial complex; (3) analyze the discrete vector field associated with the simplicial complex; and (4) finally, in this research, we propose to use discrete Morse theory as the mathematical tool for studying the topology of the electron density.
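
For steps (3)-(4), the central object is Forman's discrete Morse function; a standard statement of the definition, with critical cells playing the role QTAIM assigns to the critical points of the continuous density, reads:

```latex
% Forman's definition of a discrete Morse function f on a simplicial complex K.
\[
f : K \to \mathbb{R} \text{ is a discrete Morse function if, for every } p\text{-cell } \sigma \in K,
\]
\[
\#\{\tau^{(p+1)} \supset \sigma : f(\tau) \le f(\sigma)\} \le 1
\qquad\text{and}\qquad
\#\{\nu^{(p-1)} \subset \sigma : f(\nu) \ge f(\sigma)\} \le 1 .
\]
```

A cell σ is critical when both counts are zero; the critical cells of each index are the discrete analogues of the maxima, saddles, and minima of the electron density.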

Keywords: discrete mathematics, Discrete Morse theory, electronic density, computational calculations

Procedia PDF Downloads 90
15143 A Targeted Maximum Likelihood Estimation for a Non-Binary Causal Variable: An Application

Authors: Mohamed Raouf Benmakrelouf, Joseph Rynkiewicz

Abstract:

Targeted maximum likelihood estimation (TMLE) is a well-established method for causal effect estimation with desirable statistical properties. TMLE is a doubly robust, maximum-likelihood-based approach that includes a secondary targeting step optimizing the target statistical parameter. A causal interpretation of the statistical parameter requires the assumptions of the Rubin causal framework. The causal effect of a binary variable E on an outcome Y is defined in terms of comparisons between two potential outcomes, as E[Y_{E=1} − Y_{E=0}]. Our aim in this paper is to present an adaptation of the TMLE methodology to estimate the causal effect of a non-binary categorical variable, and to provide a large application. We propose a coding of the initial data that binarizes the variable of interest: for each category, the non-binary variable is transformed into a binary variable taking the value 1 to indicate the presence of the category (or group of categories) for an individual and 0 otherwise. Such a dummy variable makes it possible to define a pair of potential outcomes and to oppose one category (or group of categories) to another. Let E be a non-binary variable of interest. We propose a complete disjunctive coding of E: the initial variable is transformed into a set of binary vectors (dummy variables), E = (E_e : e ∈ {1, ..., |E|}), where each vector (variable) E_e takes the value 0 when its category is absent and the value 1 when it is present, which allows us to compute a pairwise TMLE comparing the difference in outcome between one category and all remaining categories. To illustrate the application of our strategy, we first present an implementation of TMLE for estimating the causal effect of a non-binary variable on an outcome using simulated data. Secondly, we apply our TMLE adaptation to survey data from the French Political Barometer (CEVIPOF) to estimate the causal effect of education level (a five-level variable) on a potential vote for the French extreme-right candidate Jean-Marie Le Pen. Counterfactual reasoning requires us to consider further causal questions (additional causal assumptions), leading to a different coding of E as a set of binary vectors, E = (E_e : e ∈ {2, ..., |E|}), where each vector (variable) E_e takes the value 0 when the first (reference) category is present and the value 1 when its own category is present, which allows us to apply a pairwise TMLE comparing the difference in outcome between the fixed first level and each remaining level. We confirm that an increase in the level of education decreases the voting rate for the extreme-right party.
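
A minimal sketch of the complete disjunctive coding and the reference-level contrasts, assuming pandas; the education levels stand in for the five-level CEVIPOF variable, and the TMLE fit itself is left as a placeholder.

```python
# Complete disjunctive (one-hot) coding of a non-binary variable E, plus the
# pairwise contrasts against a fixed reference level.
import pandas as pd

df = pd.DataFrame({"education": ["none", "primary", "secondary", "bachelor",
                                 "master", "secondary", "primary", "bachelor"]})

# One dummy E_e per category: 1 if the category is present, 0 otherwise.
dummies = pd.get_dummies(df["education"], prefix="E").astype(int)
print(dummies.head())

# Pairwise contrast against a fixed reference level ("none"): for each
# remaining level e, keep only individuals with E=none or E=e, code e as 1.
for level in ["primary", "secondary", "bachelor", "master"]:
    sub = df[df["education"].isin(["none", level])].copy()
    sub["treatment"] = (sub["education"] == level).astype(int)
    # ...fit the pairwise TMLE of E[Y_{E=level} - Y_{E=none}] on `sub` here
```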

Keywords: statistical inference, causal inference, super learning, targeted maximum likelihood estimation

Procedia PDF Downloads 87
15142 Expression of Human Papillomavirus Type 18 L1 Virus-Like Particles in the Methylotrophic Yeast Pichia pastoris

Authors: Hossein Rassi, Marjan Moradi Fard, Samaneh Niko

Abstract:

Human papillomavirus (HPV) types 16 and 18 are closely associated with the development of human cervical carcinoma, one of the most common causes of cancer death in women worldwide. At present, HPV type 18 accounts for about 34% of all HPV infections in Iran, and the most promising vaccine against HPV infection is based on the L1 major capsid protein. The L1 protein of HPV18 has the capacity to self-assemble into capsomers or virus-like particles (VLPs) that are non-infectious and highly immunogenic, allowing their use in vaccine production. The methylotrophic yeast Pichia pastoris is an efficient and inexpensive expression system used to produce high levels of heterologous proteins, and in this study we expressed HPV18 L1 VLPs in P. pastoris. The gene encoding the major capsid protein L1 of the high-risk HPV type 18 was isolated from an Iranian patient by PCR and inserted into the pTG19-T vector to obtain the recombinant expression vector pTG19-HPV18-L1. pTG19-HPV18-L1 was then transformed into E. coli strain DH5α, and the recombinant HPV18 L1 protein was expressed in soluble form under IPTG induction. The HPV18 L1 gene was excised from the recombinant plasmid with XhoI and EcoRI, ligated into the yeast expression vector pPICZα linearized with the same enzymes, and transformed into P. pastoris. Induction and expression of the HPV18 L1 protein were demonstrated by BMGY/BMMY cultivation and RT-PCR. The parameters of induced cultivation for the P. pastoris KM71 strain carrying HPV16 L1 were investigated in shaking-flask cultures. After induced cultivation in BMMY (pH 7.0) medium supplemented with methanol to a final concentration of 1.0% every 24 h at 37 degrees C for 96 h, the recombinant strain produced 78.6 mg/L of L1 protein. This work opens the possibility of producing a prophylactic vaccine against cervical carcinoma from the HPV-18 L1 gene in P. pastoris; VLP-based HPV vaccines can prevent persistent HPV18 infections and cervical cancer in Iran. The HPV-18 L1 gene was also expressed successfully in E. coli, which provides a necessary basis for preparing an HPV-18 L1 vaccine for humans, and HPV L1 proteins expressed in Pichia pastoris will facilitate HPV vaccine development and structure-function studies.

Keywords: Pichia pastoris, L1 virus-like particles, human papillomavirus type 18, biotechnology

Procedia PDF Downloads 397