Search results for: Extended Park's vector approach

15700 Enhancing Communicative Skills for Students in Automatics

Authors: Adrian Florin Busu

Abstract:

The communicative approach, or communicative language teaching (CLT), used to enhance the communicative skills of students in automatics, is a modern teaching approach based on the concept of learning a language through having to communicate real meaning. In the communicative approach, real communication is both the objective of learning and the means through which it takes place. This approach was initiated during the 1970s and quickly became prominent, as it proposed an alternative to the previous systems-oriented approaches. In other words, instead of focusing on the acquisition of grammar and vocabulary, the communicative approach aims at developing students' competence to communicate in the target language, with an enhanced focus on real-life situations. In a nutshell, CLT considers using the language to be just as important as actually learning the language.

Keywords: communication, approach, objective, learning

Procedia PDF Downloads 161
15699 Accurate Cortical Reconstruction in Narrow Sulci with Zero-Non-Zero Distance (ZNZD) Vector Field

Authors: Somojit Saha, Rohit K. Chatterjee, Sarit K. Das, Avijit Kar

Abstract:

A new force field is designed for propagating a parametric contour into deep, narrow cortical folds in the application of knowledge-based reconstruction of the cerebral cortex from MR images of the brain. The design of this force field is highly inspired by the Generalized Gradient Vector Flow (GGVF) model but differs markedly in how image information is manipulated to determine the direction of contour propagation. While GGVF uses an edge map as its main driving force, the newly designed force field uses the map of distances between zero-valued pixels and their nearest non-zero-valued pixel as its main driving force. Hence, it is called the Zero-Non-Zero Distance (ZNZD) force field. The objective of this force field is forceful propagation of the contour, beyond spurious convergence due to the partial volume effect (PVE), into narrow sulcal folds. Being a function of the corresponding non-zero pixel value, the force field has an inherent ability to determine the spuriousness of an edge automatically. It is effectively applied, along with some morphological processing, in cortical reconstruction to breach the hindrance of PVE in narrow sulci where conventional GGVF fails.
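
As an illustration of the distance map that drives the field, the following is a minimal sketch (a hypothetical re-implementation, not the authors' code), assuming a 2D MR slice in which zero-valued pixels lie inside narrow sulci:

```python
# Hypothetical ZNZD-style force field sketch, not the authors' implementation.
import numpy as np
from scipy.ndimage import distance_transform_edt

def znzd_force_field(img):
    zero_mask = (img == 0)
    # For every zero-valued pixel: distance to (and index of) the nearest
    # non-zero pixel -- the map that drives the contour in the ZNZD model.
    dist, (iy, ix) = distance_transform_edt(zero_mask, return_indices=True)
    # Weight the map by the value of the corresponding non-zero pixel, the
    # property the abstract uses to judge the spuriousness of an edge.
    weighted = dist * img[iy, ix]
    # The force points down the gradient of the weighted distance map,
    # i.e. toward the deepest part of the fold.
    gy, gx = np.gradient(weighted)
    return -gx, -gy
```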

Keywords: deformable model, external force field, partial volume effect, cortical reconstruction, MR image of brain

Procedia PDF Downloads 398
15698 Detection of Powdery Mildew Disease in Strawberry Using Image Texture and Supervised Classifiers

Authors: Sultan Mahmud, Qamar Zaman, Travis Esau, Young Chang

Abstract:

Strawberry powdery mildew (PM) is a serious disease that has a significant impact on strawberry production. Field scouting is still the main way to find PM disease, which is not only labor-intensive but also makes it almost impossible to monitor disease severity. To reduce the loss caused by PM disease and achieve faster automatic detection, this paper proposes an approach for detecting the disease based on image texture, classified with support vector machines (SVMs) and k-nearest neighbors (kNNs). The methodology of the proposed study is based on image processing and comprises five main steps: image acquisition, pre-processing, segmentation, feature extraction and classification. Two strawberry fields were used in this study. Images of healthy leaves and of leaves infected with PM (Sphaerotheca macularis) were acquired under artificial cloud lighting conditions. Colour thresholding was utilized to segment all images before textural analysis. A colour co-occurrence matrix (CCM) was used for the extraction of textural features. Forty textural features, related to a physiological parameter of the leaves, were extracted from the CCMs of the National Television System Committee (NTSC) luminance and the hue, saturation and intensity (HSI) images. The normalized feature data were utilized for training and validation, respectively, using the developed classifiers. The classifiers were evaluated with internal, external and cross-validation. The best classifier was selected based on performance and accuracy. Experimental results suggested that the SVM classifier showed 98.33%, 85.33%, 87.33%, 93.33% and 95.0% accuracy on internal, external-I, external-II, 4-fold cross and 5-fold cross-validation, respectively, whereas the kNN classifier achieved 90.0%, 72.00%, 74.66%, 89.33% and 90.3% classification accuracy, respectively. The outcome of this study demonstrated that the SVM classified PM disease with the highest overall accuracy of 91.86% and a processing time of 1.1211 seconds. Overall, the results indicate that the proposed approach can significantly support accurate and automatic identification and recognition of strawberry PM disease with the SVM classifier.
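
A compressed sketch of such a texture-classification pipeline is shown below; it is an assumed implementation in which standard GLCM statistics on one channel stand in for the paper's forty CCM features over the NTSC luminance and HSI channels:

```python
# Assumed pipeline sketch: GLCM texture features feeding SVM and kNN.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

def texture_features(gray_leaf):          # uint8 image of a segmented leaf
    glcm = graycomatrix(gray_leaf, distances=[1],
                        angles=[0, np.pi / 2], levels=256, normed=True)
    props = ["contrast", "correlation", "energy", "homogeneity"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# X: feature matrix from many leaves, y: 0 = healthy, 1 = powdery mildew
# svm = SVC(kernel="rbf").fit(X_train, y_train)
# knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
```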

Keywords: powdery mildew, image processing, textural analysis, color co-occurrence matrix, support vector machines, k-nearest neighbors

Procedia PDF Downloads 122
15697 Intrusion Detection in Computer Networks Using a Hybrid Model of Firefly and Differential Evolution Algorithms

Authors: Mohammad Besharatloo

Abstract:

Intrusion detection is an important research topic in network security because of the growing use of computer network services. Intrusion detection aims to detect unauthorized use or abuse of networks and systems by intruders. The intrusion detection system is therefore an efficient tool to control users' access through predefined regulations. Since the data used in intrusion detection systems are high-dimensional, a proper representation is required to expose the basic structure of the data, so it is necessary to eliminate redundant features to create the best representative subset. In the proposed method, a hybrid model of the differential evolution and firefly algorithms is employed to choose the best subset of features. In addition, a decision tree and a support vector machine (SVM) are adopted to assess the quality of the selected features. First, the sorted population is divided into two sub-populations, and the two optimization algorithms are applied to these sub-populations, respectively. The sub-populations are then merged to create the population for the next iteration. The performance of the proposed method is evaluated on the KDD Cup 99 dataset. The simulation results show that the proposed method outperforms the other methods in this context.
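
The wrapper-style selection loop could look like the following sketch, in which plain differential evolution stands in for the paper's firefly/differential evolution hybrid and cross-validated SVM accuracy scores each candidate subset (an assumed reconstruction, not the paper's code):

```python
# Assumed wrapper feature selection: DE stands in for the firefly/DE hybrid.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def make_fitness(X, y):
    def fitness(weights):                      # one weight per feature
        mask = weights > 0.5                   # threshold -> feature subset
        if not mask.any():
            return 1.0                         # empty subset: worst score
        acc = cross_val_score(SVC(), X[:, mask], y, cv=3).mean()
        return 1.0 - acc                       # DE minimises
    return fitness

# X, y = preprocessed KDD Cup 99 features and labels
# result = differential_evolution(make_fitness(X, y),
#                                 bounds=[(0, 1)] * X.shape[1], maxiter=20)
# selected = result.x > 0.5
```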

Keywords: intrusion detection system, differential evolution, firefly algorithm, support vector machine, decision tree

Procedia PDF Downloads 94
15696 Managing the Cloud Procurement Process: Findings from a Case Study

Authors: Andreas Jede, Frank Teuteberg

Abstract:

Cloud computing (CC) has already gained broad appreciation in research and practice. While the willingness to integrate cloud services into various IT environments remains unbroken, CC procurement processes have mostly run in an unorganized and non-standardized way. In practice, a sufficiently specific yet applicable business process for the important acquisition phase is often lacking, and research has not yet adequately remedied this deficiency. This paper therefore introduces a field-tested approach for CC procurement. Based on an extensive literature review and augmented by expert interviews, we designed a model that is validated and further refined through an in-depth real-life case study. For the detailed process description, we apply the event-driven process chain (EPC) notation. The valuable insights gained from the case study may help CC research shift toward a more socio-technical perspective. For practice, in addition to useful organizational instructions, we provide extended checklists and lessons learned.

Keywords: cloud procurement process, IT-organization, event-driven process chain, in-depth case study

Procedia PDF Downloads 394
15695 Solar Power Forecasting for the Bidding Zones of the Italian Electricity Market with an Analog Ensemble Approach

Authors: Elena Collino, Dario A. Ronzio, Goffredo Decimi, Maurizio Riva

Abstract:

The rapid increase of renewable energy in Italy is led by wind and solar installations. The 2017 Italian energy strategy foresees a further development of these sustainable technologies, especially solar. This has resulted in new opportunities, challenges, and problems to deal with. The growth of renewables makes it possible to meet European requirements regarding energy and environmental policy, but these sources are difficult to manage because they are intermittent and non-programmable. Operationally, these characteristics can lead to instability in the voltage profile and increasing uncertainty in energy reserve scheduling. The increasing renewable production must be considered with ever more attention, especially by the Transmission System Operator (TSO). The TSO, in fact, provides daily orders on energy dispatch, once the market outcome has been determined, over extended areas defined mainly on the basis of power transmission limitations. In Italy, six market zones are defined: Northern Italy, Central-Northern Italy, Central-Southern Italy, Southern Italy, Sardinia, and Sicily. Accurate hourly renewable power forecasts for the day ahead over these extended areas bring improvements in terms of both dispatching and reserve management. In this study, an operational forecasting tool for the hourly solar output of the six Italian market zones is presented and its performance analysed. The implementation is carried out by means of a numerical weather prediction model coupled with statistical post-processing to derive the power forecast from the meteorological projection. The weather forecast is obtained from the limited-area model RAMS over the Italian territory, initialized with IFS-ECMWF boundary conditions. The post-processing calculates the solar power production with the Analog Ensemble (AN) technique. This statistical approach forecasts the production using a probability distribution of the measured production registered in the past when the weather scenario looked very similar to the forecasted one. The similarity is evaluated for the components of the solar radiation: global (GHI), diffuse (DIF) and direct normal (DNI) irradiation, together with the corresponding azimuth and zenith solar angles. These are, in fact, the main factors that affect solar production. Considering that the AN performance is strictly related to the length and quality of the historical data, a training period of more than one year has been used. The training set consists of historical Numerical Weather Prediction (NWP) forecasts at 12 UTC for the GHI, DIF and DNI variables over the Italian territory, together with the corresponding hourly measured production for each of the six zones. The AN technique makes it possible to estimate the aggregate solar production in an area without information about the technological characteristics of all the solar parks present there; besides, this information is often only partially available. Every day, the hourly solar power forecast for the six Italian market zones is made publicly available through a website.
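
A minimal sketch of the analog ensemble step, under the simplifying assumption of a plain Euclidean similarity over the five predictors (GHI, DIF, DNI, azimuth, zenith), might read:

```python
# Simplified analog ensemble sketch; the operational similarity metric and
# data handling are assumptions, not the authors' exact implementation.
import numpy as np

def analog_ensemble(hist_nwp, hist_power, new_nwp, k=20):
    """hist_nwp: (n, 5) past NWP forecasts, hist_power: (n,) measured output,
    new_nwp: (5,) today's forecast. Returns mean and percentile forecasts."""
    d = np.linalg.norm(hist_nwp - new_nwp, axis=1)   # scenario similarity
    analogs = np.argsort(d)[:k]                      # k most similar past hours
    sample = hist_power[analogs]                     # their measured production
    return sample.mean(), np.percentile(sample, [10, 50, 90])
```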

Keywords: analog ensemble, electricity market, PV forecast, solar energy

Procedia PDF Downloads 159
15694 Unusual Presentation of Colorectal Cancer within Inguinal Hernia: A Systematic Review of Reported Cases

Authors: Sena Park

Abstract:

Background: The concurrent presentation of colorectal cancer within an inguinal hernia is extremely rare. Due to its rarity, this presentation may lead to diagnostic and therapeutic dilemmas. We aim to review all reported cases of colorectal cancer incarcerated in an inguinal hernia in the last 20 years and discuss the operative approaches. Methods: We identified all case reports on colorectal cancer within an inguinal hernia using PUBMED (2002-2022) and MEDLINE (2002-2022). The search strategy included the following keywords: colorectal cancer (title/abstract) AND inguinal hernia (title/abstract) OR incarceration (title/abstract). The search did not include letters, book chapters, systematic reviews, meta-analyses or editorials. Results: In the last 20 years, a total of 19 cases of colorectal cancer within an inguinal hernia were identified. Patient age ranged between 48 and 89. The majority of the patients were male (95%). The most commonly involved part of the large intestine was the sigmoid colon (79%). Of all the cases, 79 percent of patients received an open procedure and 21 percent a laparoscopic procedure. Discussion: Inguinal hernias are common, with an incidence of approximately 1.7 percent. Colorectal cancer is one of the leading causes of cancer-related mortality worldwide. However, their concurrent presentation is extremely rare. In the last 20 years, 19 cases of concurrent presentation of colorectal cancer and inguinal hernia have been reported. Most patients who had open procedures had two incisions: a groin incision and a midline laparotomy. There were 4 cases in which the oncological resection was performed laparoscopically. The advantages of laparoscopic resection include reduced blood loss, reduced post-operative pain, reduced length of hospital stay and a similar number of lymph nodes taken. From the review of the cases in the last 20 years, both open and laparoscopic approaches seem to be safe and to achieve adequate oncological resections. Conclusion: This is a brief overview of reported cases of colorectal cancer presenting concurrently within an inguinal hernia. Due to its rarity, there are no current guidelines on the operative approach in clinical practice. The experience of the last 20 years supports both open and laparoscopic approaches.

Keywords: colorectal cancer, inguinal hernia, incarceration, operative approach

Procedia PDF Downloads 101
15693 Online Handwritten Character Recognition for South Indian Scripts Using Support Vector Machines

Authors: Steffy Maria Joseph, Abdu Rahiman V, Abdul Hameed K. M.

Abstract:

Online handwritten character recognition is a challenging field in artificial intelligence. The classification success rate of current techniques decreases when the dataset involves similarity and complexity in stroke styles, number of strokes and variations in stroke characteristics. Malayalam is a complex South Indian language spoken by about 35 million people, especially in Kerala and the Lakshadweep islands. In this paper, we consider significant feature extraction for the similar stroke styles of Malayalam. The extracted feature set is also suitable for the recognition of other handwritten South Indian languages such as Tamil, Telugu and Kannada. A classification scheme based on support vector machines (SVM) is proposed to improve the accuracy of classification and recognition of online Malayalam handwritten characters; SVM classifiers are well suited to such real-world applications. The contribution of various features towards recognition accuracy is analysed, and performance for different SVM kernels is also studied. A graphical user interface has been developed for reading and displaying the characters. Different writing styles are taken for each of the 44 alphabets. Various features are extracted and used for classification after pre-processing of the input data samples. The highest recognition accuracy of 97% is obtained experimentally with the best feature combination and a polynomial kernel in the SVM.
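
The kernel study could be reproduced along the following lines (an assumed setup; the stroke features and training data would be those described in the paper):

```python
# Assumed sketch of the kernel comparison on pre-extracted stroke features.
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

param_grid = {"kernel": ["linear", "rbf", "poly"],
              "C": [1, 10, 100], "degree": [2, 3]}   # degree used by 'poly'
search = GridSearchCV(SVC(), param_grid, cv=5)
# X: n_samples x n_features stroke descriptors, y: labels for the 44 classes
# search.fit(X_train, y_train)
# print(search.best_params_, search.score(X_test, y_test))
```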

Keywords: SVM, MATLAB, Malayalam, South Indian scripts, online handwritten character recognition

Procedia PDF Downloads 576
15692 The Russian Preposition 'за': A Cognitive Linguistic Approach

Authors: M. Kalyuga

Abstract:

Prepositions have long been considered one of the major challenges for second language learners, since they have multiple uses that differ greatly from one language to another. The traditional approach to second language teaching supplies students with a list of uses of a preposition that they have to memorise, and no explanation is provided. Contrary to the traditional grammar approach, the cognitive linguistic approach offers an explanation for the use of prepositions and provides strategies to comprehend and learn prepositions that would otherwise seem obscure. The present paper demonstrates the use of the cognitive approach for the explanation of prepositions through the example of the Russian preposition 'за'. The paper demonstrates how various spatial and non-spatial uses of this preposition are linked together through metaphorical and metonymical mapping. The diversity of expressions with 'за' is explained by the range of spatial scenes this preposition is associated with.

Keywords: language teaching, Russian, preposition 'за', cognitive approach

Procedia PDF Downloads 452
15691 Whether Buffer Zone Community Forests’ Benefits Are Distributed Fairly to Low-Income Users: Reflection From the Buffer Zone Community Forests in Bardia National Park, Nepal

Authors: Keshav Raj Acharya, Thakur Silwal, Neelam C. Poudyal

Abstract:

Buffer zones, the peripheral areas around national parks and wildlife reserves, exist to benefit local inhabitants by providing forest products for subsistence needs outside the protected areas. The forest area within the buffer zone has been managed as buffer zone community forest (BZCF) for the last 25 years, following the approval of the buffer zone management regulation in 1996. With a case study of selected BZCFs in Bardia National Park, this study analyzes whether the benefits provided by BZCFs are equally available to poor users as to other socioeconomic classes of users. The findings are based on the analysis of cross-sectional data involving household surveys (n=305) and key informant interviews (n=10), as well as office records available at five different buffer zone community forest user group offices. Results indicate that, despite provisions for subsidized rates for the poor, poor households were more deprived due to higher forest product prices, particularly timber prices, in the buffer zone. Evidence also indicates that, due to increased forest coverage, the incidence of wildlife damage has increased and has impacted the poor more because of their lack of land ownership and limited alternatives. Clear community forest management guidelines, with equitable benefit sharing and compensatory mechanisms for users of poor socioeconomic class, have been identified as a solution to increase the benefit to poor users in BZCF user groups.

Keywords: crop depredation, forest products, users, wellbeing ranking

Procedia PDF Downloads 56
15690 Measuring Financial Asset Return and Volatility Spillovers, with Application to Sovereign Bond, Equity, Foreign Exchange and Commodity Markets

Authors: Petra Palic, Maruska Vizek

Abstract:

We provide an in-depth analysis of the interdependence of asset returns and volatilities in developed and developing countries. The analysis is split into three parts. In the first part, we use a multivariate GARCH model to provide stylized facts on cross-market volatility spillovers. In the second part, we use the generalized vector autoregressive methodology developed by Diebold and Yilmaz (2009) to estimate separate measures of return spillovers and volatility spillovers among sovereign bond, equity, foreign exchange and commodity markets. In particular, our analysis is focused on cross-market return and volatility spillovers in 19 developed and developing countries. To estimate these spillovers, we use daily data from 2008 to 2017. In the third part of the analysis, we use a generalized vector autoregressive framework to estimate total and directional volatility spillovers. We use the same daily data span for one developed and one developing country in order to characterize daily volatility spillovers across stock, bond, foreign exchange and commodity markets.
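
As a sketch of the spillover computation, the following uses statsmodels' Cholesky-identified forecast error variance decomposition, in the spirit of Diebold and Yilmaz (2009); the lag order, horizon and variable names are illustrative assumptions:

```python
# Illustrative Diebold-Yilmaz (2009)-style total spillover index.
import numpy as np
from statsmodels.tsa.api import VAR

def spillover_index(returns_df, lags=2, horizon=10):
    fevd = VAR(returns_df).fit(lags).fevd(horizon)
    theta = fevd.decomp[:, -1, :]          # market i's variance shares at horizon
    cross = theta.sum() - np.trace(theta)  # variance owed to other markets
    return 100.0 * cross / theta.sum()     # total spillover, in percent

# df: daily returns (or volatilities) of bond, equity, FX and commodity series
# print(spillover_index(df))
```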

Keywords: cross-market spillovers, sovereign bond markets, equity markets, vector autoregression (VAR)

Procedia PDF Downloads 264
15689 Money Laundering Risk Assessment in the Banking Institutions: An Experimental Approach

Authors: Yusarina Mat-Isa, Zuraidah Mohd-Sanusi, Mohd-Nizal Haniff, Paul A. Barnes

Abstract:

Since money laundering has become a prominent concern for banking institutions, they are obliged to adopt a risk-based approach as an integral component of accepted anti-money laundering policies. In doing so, those involved with banking operations are the most critical group of personnel, as they deal with the day-to-day operations of the banking institutions and are obligated to form a judgement on the level of impending risk. This requirement extends to all relevant banking institution staff, such as tellers and customer account representatives, who must identify suspicious customers and escalate them to the relevant authorities. Banking institution staff, however, face enormous challenges in identifying and distinguishing money launderers from other legitimate customers seeking genuine banking transactions. They are mostly educated and trained with the business objective of serving customers in mind, not to be "detectives with a detective's power of observation". Despite increasing awareness and training, their competency in assessing money laundering risk is still insufficient. Several gaps have prompted this study, including the lack of behavioural perspectives in the assessment of money laundering risk in banking institutions. Utilizing an experimental approach, respondents were randomly assigned within a controlled setting to manipulated situations, and their judgement was solicited based on various observations related to those situations. The study suggests that it is imperative that informed judgement be exercised in deciding whether to proceed with the banking services required by customers. Judgement forms the basis of the banking institution staff's opinion on whether a customer poses a money laundering risk. Failure to exercise good judgement could result in losses and the absorption of unnecessary risk into the banking institution. Although banking institutions have access to automated solutions for assessing money laundering risk, the human factor in assessing the risk is indispensable. Individual staff in banking institutions are the first line of defence, responsible for screening the impending risk of any customer soliciting banking services. At the end of the spectrum, the individual's role in money laundering risk assessment cannot be substituted by automated solutions, as human judgement is inimitable.

Keywords: banking institutions, experimental approach, money laundering, risk assessment

Procedia PDF Downloads 267
15688 Facility Anomaly Detection with Gaussian Mixture Model

Authors: Sunghoon Park, Hank Kim, Jinwon An, Sungzoon Cho

Abstract:

The Internet of Things allows one to collect data from facilities, which are then used to monitor them and even predict malfunctions in advance. Conventional quality control methods focus on setting a normal range for a sensor value, defined between a lower control limit and an upper control limit, and declaring as an anomaly anything falling outside it. However, interactions among sensor values are ignored, leading to suboptimal performance. We propose a multivariate approach which takes many sensor values into account at the same time. In particular, a Gaussian Mixture Model is used, trained to maximize the likelihood using the Expectation-Maximization algorithm. The number of Gaussian component distributions is determined by the Bayesian Information Criterion. The negative log-likelihood is used as an anomaly score. The usage scenario is as follows: for each instance of sensor values from a facility, an anomaly score is computed; if it is larger than a threshold, an alarm goes off and a human expert intervenes and checks the system. Real-world data from a building energy system were used to test the model.
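
A minimal sketch of the proposed detector (an assumed implementation) is:

```python
# Assumed sketch: BIC selects the component count; -log p(x) scores anomalies.
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_detector(X_normal, max_components=10):
    models = [GaussianMixture(k, covariance_type="full").fit(X_normal)
              for k in range(1, max_components + 1)]
    return min(models, key=lambda m: m.bic(X_normal))   # BIC model selection

def anomaly_score(model, X):
    return -model.score_samples(X)      # negative log-likelihood per instance

# gmm = fit_detector(sensor_history)
# alarm = anomaly_score(gmm, new_readings) > threshold   # expert then checks
```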

Keywords: facility anomaly detection, gaussian mixture model, anomaly score, expectation maximization algorithm

Procedia PDF Downloads 273
15687 Reliability-Based Maintenance Management Methodology to Minimise Life Cycle Cost of Water Supply Networks

Authors: Mojtaba Mahmoodian, Joshua Phelan, Mehdi Shahparvari

Abstract:

With a large percentage of countries' total infrastructure expenditure attributed to water network maintenance, it is essential to optimise maintenance strategies so that underground pipes are rehabilitated or replaced before failure occurs. The aim of this paper is to provide water utility managers with a maintenance management approach for underground water pipes, subject to external loading and material corrosion, that gives the lowest life cycle cost over a predetermined time period. This reliability-based maintenance management methodology identifies the optimal years for intervention and the ideal number of maintenance activities to perform before replacement, and specifies feasible renewal options and intervention prioritisation to minimise the life cycle cost. The study was then extended to include feasible renewal methods by determining the structural condition index and the potential for soil loss, then obtaining the failure impact rating to assist in prioritising pipe replacement. A case study on the optimisation of maintenance plans for the Melbourne water pipe network is considered in this paper to evaluate the practicality of the proposed methodology. The results confirm that the suggested methodology can provide water utility managers with a reliable, systematic approach to determining optimum maintenance plans for pipe networks.
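
The schedule search can be illustrated with a toy sketch; all costs, the discount rate and the failure-probability model below are assumptions, not the paper's data:

```python
# Toy life cycle cost comparison over candidate intervention schedules.
HORIZON, RATE = 50, 0.04                      # years, discount rate (assumed)
REPAIR, REPLACE, FAIL = 20e3, 120e3, 300e3    # unit costs (assumed)

def life_cycle_cost(repair_years, replace_year, p_fail):
    """p_fail(year, repairs_since_renewal) -> annual failure probability."""
    cost, repairs = 0.0, 0
    for t in range(1, HORIZON + 1):
        df = (1 + RATE) ** -t                 # discount factor for year t
        if t in repair_years:
            cost, repairs = cost + REPAIR * df, repairs + 1
        if t == replace_year:
            cost, repairs = cost + REPLACE * df, 0
        cost += p_fail(t, repairs) * FAIL * df  # expected cost of failure
    return cost

# best plan = the (repair_years, replace_year) pair minimising life_cycle_cost
```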

Keywords: water pipe networks, maintenance management, reliability analysis, optimum maintenance plan

Procedia PDF Downloads 156
15686 Insecticide Resistance Detection on Dengue Vector, Aedes albopictus Obtained from Kapit, Kuching and Sibu Districts in Sarawak State, Malaysia

Authors: Koon Weng Lau, Chee Dhang Chen, Abdul Aziz Azidah, Mohd Sofian-Azirun

Abstract:

Recently, the Sarawak state of Malaysia encountered an outbreak of dengue fever. Aedes albopictus has been incriminated as one of the important vectors of dengue transmission. Without an effective vaccine, approaches to control or prevent dengue will focus on the vectors. The control of Aedes mosquitoes is still dependent on the use of chemical insecticides, and insecticide resistance represents a threat to the effectiveness of vector control. This study was conducted to determine the resistance status of 11 active ingredients representing four major insecticide classes: DDT, dieldrin, malathion, fenitrothion, bendiocarb, propoxur, etofenprox, deltamethrin, lambda-cyhalothrin, cyfluthrin, and permethrin. Standard WHO test procedures were conducted to determine insecticide susceptibility. Aedes albopictus collected from Kapit (resistance ratio, RR = 1.04–3.02), Kuching (RR = 1.17–4.61), and Sibu (RR = 1.06–3.59) exhibited low resistance toward all insecticides except dieldrin. This study revealed that dieldrin is still effective against Ae. albopictus, followed by fenitrothion, cyfluthrin, and deltamethrin. In conclusion, Ae. albopictus in Sarawak exhibited different resistance levels toward the various insecticides, and alternative solutions should be implemented to prevent further deterioration of the situation.

Keywords: Aedes albopictus, dengue, insecticide resistance, Malaysia

Procedia PDF Downloads 354
15685 Local Interpretable Model-agnostic Explanations (LIME) Approach to Email Spam Detection

Authors: Rohini Hariharan, Yazhini R., Blessy Maria Mathew

Abstract:

Detecting email spam is an important task in the era of digital technology, which needs effective ways of curbing unwanted messages. This paper presents an approach aimed at making email spam categorization algorithms transparent, reliable and more trustworthy by incorporating Local Interpretable Model-agnostic Explanations (LIME). Our technique provides interpretable explanations for specific classifications of emails, helping users understand the model's decision-making process. In this study, we developed a complete pipeline that incorporates LIME into the spam classification framework and allows the creation of simplified, interpretable models tailored to individual emails. LIME identifies influential terms, pointing out the key elements that drive classification results, thus reducing the opacity inherent in conventional machine learning models. Additionally, we propose a visualization scheme for displaying keywords that improves users' understanding of categorization decisions. We test our method on a diverse email dataset and compare its performance with various baseline models, such as Gaussian Naive Bayes, Multinomial Naive Bayes, Bernoulli Naive Bayes, Support Vector Classifier, K-Nearest Neighbors, Decision Tree, and Logistic Regression. Our testing results show that our model surpasses all the others, achieving an accuracy of 96.59% and a precision of 99.12%.
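
A hedged sketch of the LIME integration follows; the pipeline below (TF-IDF plus logistic regression) is an assumption and does not reproduce the paper's exact preprocessing or classifier settings:

```python
# Assumed LIME-over-spam-classifier sketch using the lime package.
from lime.lime_text import LimeTextExplainer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

pipeline = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
# pipeline.fit(train_emails, train_labels)        # labels: 0 = ham, 1 = spam

explainer = LimeTextExplainer(class_names=["ham", "spam"])
# exp = explainer.explain_instance(email_text,
#                                  pipeline.predict_proba, num_features=10)
# exp.as_list() -> [(term, weight), ...]: the influential terms to visualise
```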

Keywords: text classification, LIME (local interpretable model-agnostic explanations), stemming, tokenization, logistic regression

Procedia PDF Downloads 48
15684 Small Target Recognition Based on Trajectory Information

Authors: Saad Alkentar, Abdulkareem Assalem

Abstract:

Recognizing small targets has always posed a significant challenge in image analysis. Over long distances, the image signal-to-noise ratio tends to be low, limiting the amount of useful information available to detection systems. Consequently, visual target recognition becomes an intricate task to tackle. In this study, we introduce a Track Before Detect (TBD) approach that leverages target trajectory information (coordinates) to effectively distinguish between noise and potential targets. By reframing the problem as a multivariate time series classification, we have achieved remarkable results. Specifically, our TBD method achieves an impressive 97% accuracy in separating target signals from noise within a mere half-second time span (consisting of 10 data points). Furthermore, when classifying the identified targets into our predefined categories—airplane, drone, and bird—we achieve an outstanding classification accuracy of 96% over a more extended period of 1.5 seconds (comprising 30 data points).
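
The time-series formulation can be sketched as below, with flattened coordinate windows and off-the-shelf classifiers standing in for the paper's (unspecified) models:

```python
# Assumed sketch of the two-stage trajectory classification.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(tracks):
    """tracks: (n, T, 2) array of T (x, y) points per candidate track."""
    return tracks.reshape(len(tracks), -1)        # flatten to n x (2T)

detector = RandomForestClassifier(n_estimators=200)    # noise vs. target, T=10
classifier = RandomForestClassifier(n_estimators=200)  # plane/drone/bird, T=30
# detector.fit(window_features(tracks_10), is_target)
# classifier.fit(window_features(tracks_30), category)
```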

Keywords: small targets, drones, trajectory information, TBD, multivariate time series

Procedia PDF Downloads 48
15683 A Comparative Study of Optimization Techniques and Models for Forecasting Dengue Fever

Authors: Sudha T., Naveen C.

Abstract:

Dengue is a serious public health issue that imposes significant annual economic and welfare burdens on nations. However, enhanced optimization techniques and quantitative modeling approaches can predict the incidence of dengue. By advocating for a data-driven approach, public health officials can make informed decisions, improving the overall effectiveness of outbreak control efforts. This study uses environmental data from two U.S. federal government agencies: the National Oceanic and Atmospheric Administration and the Centers for Disease Control and Prevention. Based on environmental data describing changes in temperature, precipitation, vegetation, and other factors known to affect dengue incidence, several predictive models are constructed that use different machine learning methods to estimate weekly dengue cases. The first step is data preparation, which includes handling outliers and missing values to make sure the data is ready for subsequent processing and the creation of an accurate forecasting model. In the second phase, multiple feature selection procedures are applied using various machine learning models and optimization techniques. In the third phase, machine learning models such as the Huber Regressor, Support Vector Machine, Gradient Boosting Regressor (GBR), and Support Vector Regressor (SVR) are compared with several optimization techniques for feature selection, such as Harmony Search and the Genetic Algorithm. In the fourth stage, model performance is evaluated using Mean Square Error (MSE), Mean Absolute Error (MAE), and Root Mean Square Error (RMSE) as metrics. The goal is to select the optimization strategy with the fewest errors, lowest cost, and greatest productivity or predictive potential. Optimization is widely employed in a variety of fields, including engineering, science, management, mathematics, finance, and medicine. An effective optimization method based on harmony search and an integrated genetic algorithm is introduced for input feature selection, and it shows a marked improvement in the model's predictive accuracy. The predictive models built on the Huber Regressor perform best for both optimization and prediction.
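
The comparison stage might be implemented along these lines (an assumed setup with illustrative names):

```python
# Assumed sketch: each regressor scored with MSE, MAE and RMSE on held-out
# weekly dengue case counts.
import numpy as np
from sklearn.linear_model import HuberRegressor
from sklearn.svm import SVR
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error, mean_absolute_error

models = {"Huber": HuberRegressor(), "SVR": SVR(),
          "GBR": GradientBoostingRegressor()}
# for name, m in models.items():
#     pred = m.fit(X_train, y_train).predict(X_test)
#     mse = mean_squared_error(y_test, pred)
#     print(name, mse, mean_absolute_error(y_test, pred), np.sqrt(mse))
```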

Keywords: deep learning model, dengue fever, prediction, optimization

Procedia PDF Downloads 67
15682 Monetary Evaluation of Dispatching Decisions in Consideration of Choice of Transport

Authors: Marcel Schneider, Nils Nießen

Abstract:

Microscopic simulation programs can describe both railway operations and the preceding timetabling process. Occupation conflicts are often solved based on defined train priorities at both process levels. These conflict resolutions produce knock-on delays for the trains involved. The sum of knock-on delays is commonly used to evaluate the quality of railway operations: it is either compared to an acceptable level of service, or the delays are evaluated economically by linear monetary functions. It is impossible to properly evaluate dispatching decisions without a well-founded objective function. This paper presents a new approach to the evaluation of dispatching decisions. It uses models of choice of transport and considers the behaviour of the end customers. These models evaluate the knock-on delays in more detail than linear monetary functions and consider other competing modes of transport. The new approach pursues the coupling of a microscopic model of railway operations with a macroscopic model of choice of transport. It will first be implemented for the railway operations process, but it can also be used for timetabling. The evaluation considers the end customers' option of changing over to other transport modes. The new approach first covers rail and road transport, but it can also be extended to air transport. The split between the end customers is described by the modal split, and the end customers' reactions affect the revenues of the railway undertakings. Different travel purposes have different reserves and tolerances towards delays. Longer journey times lead to revenue changes as well as additional costs. The costs depend either on time or on the track and arise from the circulation of workers and vehicles. Only the variable values are summarised in the contribution margin, which is the basis for the monetary evaluation of the delays. The contribution margin is calculated for different resolutions of the same conflict, and the conflict resolution is improved until the monetary loss is minimised. The iterative process thus determines an optimum conflict resolution by observing the change in the contribution margin. Furthermore, a monetary value can also be determined for each dispatching decision.
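
A toy illustration of the evaluation follows; the logit form and every number are assumptions, not the paper's calibrated model:

```python
# Toy contribution-margin scoring of conflict resolutions (all values assumed).
import math

def rail_share(delay_min, base_utility=1.0, beta=0.05):
    """Binary logit split between rail and a competing road alternative."""
    u_rail = base_utility - beta * delay_min
    return math.exp(u_rail) / (math.exp(u_rail) + 1.0)

def contribution_margin(knock_on_delays, fare=25.0, pax=200,
                        cost_per_min=8.0):
    revenue = sum(fare * pax * rail_share(d) for d in knock_on_delays)
    variable_cost = sum(cost_per_min * d for d in knock_on_delays)
    return revenue - variable_cost

# resolutions: {name: [knock-on delay per affected train, in minutes]}
# best = max(resolutions, key=lambda r: contribution_margin(resolutions[r]))
```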

Keywords: choice of transport, knock-on delays, monetary evaluation, railway operations

Procedia PDF Downloads 329
15681 A 7 Dimensional-Quantitative Structure-Activity Relationship Approach Combining Quantum Mechanics Based Grid and Solvation Models to Predict Hotspots and Kinetic Properties of Mutated Enzymes: An Enzyme Engineering Perspective

Authors: R. Pravin Kumar, L. Roopa

Abstract:

Enzymes are molecular machines used in various industries such as pharmaceuticals, cosmetics, food and animal feed, paper and leather processing, and biofuel. This has been possible only through the painstaking efforts of chemists and biologists to evolve and engineer these mysterious biomolecules to do the work needed. The main agenda of this enzyme engineering project is to derive screening and selection tools to obtain focused libraries of enzyme variants with desired qualities. The methodologies for this research include the well-established directed evolution and rational redesign, and the less established yet much faster and more accurate in silico methods. This concept was initiated as Receptor-Dependent 4-Dimensional Quantitative Structure-Activity Relationship (RD-4D-QSAR) modeling to predict kinetic properties of enzymes and is extended here to study a transaminase via a 7D-QSAR approach. Induced-fit scenarios were explored using Quantum Mechanics/Molecular Mechanics (QM/MM) simulations, which were then placed in a grid that stores interaction energies derived from QM parameters (QMgrid). In this study, the mutated enzymes were immersed completely inside the QMgrid, and this was combined with solvation models to predict descriptors. After statistical screening of descriptors, the QSAR models showed > 90% specificity and > 85% sensitivity towards the experimental activity. Mapping descriptors onto the enzyme structure revealed hotspots important for enhancing the enantioselectivity of the enzyme.

Keywords: QMgrid, QM/MM simulations, RD-4D-QSAR, transaminase

Procedia PDF Downloads 137
15680 A Case for Strategic Landscape Infrastructure: South Essex Estuary Park

Authors: Alexandra Steed

Abstract:

Alexandra Steed URBAN was commissioned to undertake the South Essex Green and Blue Infrastructure Study (SEGBI) on behalf of the Association of South Essex Local Authorities (ASELA): a partnership of seven neighboring councils within the Thames Estuary. Located on London's doorstep, the 70,000-hectare region is under extraordinary pressure for regeneration, further development, and economic expansion, yet faces extreme challenges: sea-level rise and inadequate flood defenses, stormwater flooding and threatened infrastructure, loss of internationally important habitats, significant existing community deprivation, and a lack of connectivity and access to green space. The brief was to embrace these challenges in the creation of a document that would form a key part of ASELA's Joint Strategic Framework and feed into local plans and master plans, thus helping to tackle climate change, ecological collapse, and social inequity at a regional scale whilst creating a relationship and awareness between urban communities and the surrounding landscapes and nature. The SEGBI project applied a 'land-based' methodology combined with a co-design approach involving numerous stakeholders to explore how living infrastructure can address these significant issues, reshape future planning and development, and create thriving places for the whole community of life. It comprised three key stages: a Baseline Review; a Green and Blue Infrastructure Assessment; and the final Green and Blue Infrastructure Report. The resulting proposals frame an ambitious vision for the delivery of a new regional South Essex Estuary (SEE) Park – 24,000 hectares of protected and connected landscapes. This unified parkland system will drive effective place-shaping and "leveling up" for the most deprived communities while providing large-scale nature recovery and biodiversity net gain. Comprehensive analysis and policy recommendations ensure best practices will be embedded within the planning documents and decisions guiding future development. Furthermore, a Natural Capital Account undertaken as part of the strategy shows the tremendous economic value of the natural assets. The strategy sets a pioneering precedent that demonstrates how prioritising living infrastructure can address climate change and ecological collapse while also supporting sustainable housing, healthier communities, and resilient infrastructure. It was achievable only through a collaborative, cross-boundary approach to strategic planning and growth, with a shared vision of place and a strong commitment to delivery. With joined-up thinking and a joined-up region, a more impactful plan for South Essex was developed that will lead to numerous environmental, social, and economic benefits across the region, enhancing the landscape and natural environs on the periphery of one of the largest cities in the world.

Keywords: climate change, green and blue infrastructure, landscape architecture, master planning, regional planning, social equity

Procedia PDF Downloads 98
15679 Radar Fault Diagnosis Strategy Based on Deep Learning

Authors: Bin Feng, Zhulin Zong

Abstract:

Radar systems are critical to modern military, aviation, and maritime operations, and their proper functioning is essential for the success of these operations. However, due to their complexity and sensitivity, radar systems are susceptible to various faults that can significantly affect performance. Traditional radar fault diagnosis strategies rely on expert knowledge and rule-based approaches, which are often limited in effectiveness and require substantial time and resources. Deep learning has recently emerged as a promising approach to fault diagnosis due to its ability to learn features and patterns from large amounts of data automatically. In this paper, we propose a radar fault diagnosis strategy based on deep learning that can accurately identify and classify faults in radar systems. Our approach uses convolutional neural networks (CNNs) to extract features from radar signals and to classify faults from those features. The proposed strategy is trained and validated on a dataset of measured radar signals containing various types of faults, and the results show that it achieves high accuracy in fault diagnosis. To further evaluate its effectiveness, we compare the proposed strategy with traditional rule-based approaches and other machine-learning methods, including decision trees, support vector machines (SVMs), and random forests. The results demonstrate that our deep-learning-based approach outperforms the traditional approaches in terms of accuracy and efficiency. Finally, we discuss the potential applications and limitations of the proposed strategy, as well as future research directions. Our study highlights the importance and potential of deep learning for radar fault diagnosis and suggests that it can be a valuable tool for improving the performance and reliability of radar systems. In summary, this paper presents a radar fault diagnosis strategy based on deep learning that achieves high accuracy and efficiency in identifying and classifying faults in radar systems; the proposed strategy has significant potential for practical applications and paves the way for further research.
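
A brief sketch of a 1D-CNN fault classifier is given below; the architecture is an assumption, since the paper does not publish its layer configuration:

```python
# Assumed 1D-CNN architecture for classifying faults from radar signal windows.
import torch
import torch.nn as nn

class RadarFaultCNN(nn.Module):
    def __init__(self, n_classes, signal_len=1024):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(4))
        self.head = nn.Linear(32 * (signal_len // 16), n_classes)

    def forward(self, x):               # x: (batch, 1, signal_len)
        z = self.features(x)
        return self.head(z.flatten(1))  # logits over fault classes

# model = RadarFaultCNN(n_classes=5)
# trained with nn.CrossEntropyLoss() on labelled radar signal windows
```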

Keywords: radar system, fault diagnosis, deep learning, radar fault

Procedia PDF Downloads 92
15678 Wind Power Forecasting Using Echo State Networks Optimized by Big Bang-Big Crunch Algorithm

Authors: Amir Hossein Hejazi, Nima Amjady

Abstract:

In recent years, due to environmental issues, traditional energy sources have increasingly been replaced by renewable ones. Wind energy, the fastest-growing renewable energy, accounts for a considerable share of electricity in power markets. With this fast worldwide growth of wind energy, owners and operators of wind farms, transmission system operators, and energy traders need reliable and secure forecasts of wind energy production. In this paper, a new forecasting strategy is proposed for short-term wind power prediction based on Echo State Networks (ESN). The forecast engine utilizes a state-of-the-art training process, including a dynamical reservoir with a high capability to learn the complex dynamics of wind power or wind vector signals. The study is made more interesting by incorporating the prediction of wind direction into the forecast strategy. The Big Bang-Big Crunch (BB-BC) evolutionary optimization algorithm is adopted for adjusting the free parameters of the ESN-based forecaster. The proposed method is tested on real-world hourly data to show the efficiency of the forecasting engine in predicting both the wind vector and the wind power output of aggregated wind power production.
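
A minimal echo state network sketch follows, with assumed hyperparameters and the BB-BC optimisation of the free parameters omitted:

```python
# Minimal ESN sketch: random reservoir plus ridge-regression readout.
import numpy as np

rng = np.random.default_rng(0)

def make_esn(n_in, n_res=300, rho=0.9):
    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    W = rng.normal(size=(n_res, n_res))
    W *= rho / np.abs(np.linalg.eigvals(W)).max()   # set spectral radius
    return W_in, W

def run_reservoir(W_in, W, U):
    x, states = np.zeros(W.shape[0]), []
    for u in U:                                     # U: (T, n_in) inputs
        x = np.tanh(W_in @ u + W @ x)               # leaky term omitted
        states.append(x.copy())
    return np.array(states)

# Ridge readout: W_out = (S^T S + a I)^-1 S^T y, with S the reservoir states
# S = run_reservoir(W_in, W, wind_inputs)
# W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(S.shape[1]), S.T @ y)
```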

Keywords: wind power forecasting, echo state network, big bang-big crunch, evolutionary optimization algorithm

Procedia PDF Downloads 573
15677 Regeneration of Geological Models Using Support Vector Machine Assisted by Principal Component Analysis

Authors: H. Jung, N. Kim, B. Kang, J. Choe

Abstract:

History matching is a crucial procedure for predicting reservoir performance and making future decisions. However, it is difficult due to uncertainties in the initial reservoir models. It is therefore important to have reliable initial models for successful history matching of highly heterogeneous reservoirs such as channel reservoirs. In this paper, we propose a novel scheme for regenerating geological models using a support vector machine (SVM) and principal component analysis (PCA). First, we perform PCA to identify the main geological characteristics of the models. Through this procedure, the permeability values of each model are transformed into new parameters by the principal components with eigenvalues of large magnitude. Second, the parameters are projected onto a two-dimensional plane by multi-dimensional scaling (MDS) based on Euclidean distances. Finally, we train an SVM classifier using the 20% of models that show the most similar or dissimilar well oil production rates (WOPR) relative to the true values (10% each). The other 80% of the models are then classified by the trained SVM, and we select the models on the low-WOPR-error side. One hundred channel reservoir models are initially generated by single normal equation simulation. By repeating the classification process, we can select models that share the geological trend of the true reservoir model. The average field of the selected models is utilized as a probability map for regeneration. Newly generated models preserve correct channel features and exclude wrong geological properties while maintaining suitable uncertainty ranges. History matching with the initial models cannot provide trustworthy results, as it fails to find the correct geological features of the true model. However, history matching with the regenerated ensemble offers reliable characterization results by capturing the proper channel trend; furthermore, it gives dependable predictions of future performance with reduced uncertainties. We propose a novel classification scheme integrating PCA, MDS, and SVM for regenerating reservoir models. The scheme can easily sort out reliable models that share the channel trend of the reference model in the reduced-dimension space.
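
A condensed sketch of the PCA-MDS-SVM screening loop is given below (assumed implementation details; SNESIM model generation is outside the snippet):

```python
# Assumed screening loop: PCA -> MDS -> SVM over an ensemble of 100 models.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import MDS
from sklearn.svm import SVC

def screen_models(perm_fields, wopr_error):
    """perm_fields: (n_models, n_cells) permeabilities per model;
    wopr_error: |simulated - observed| WOPR of each model."""
    scores = PCA(n_components=20).fit_transform(perm_fields)   # main trends
    xy = MDS(n_components=2).fit_transform(scores)             # 2-D projection
    order = np.argsort(wopr_error)
    train = np.r_[order[:10], order[-10:]]          # 10% best + 10% worst
    labels = (wopr_error[train] <= np.median(wopr_error)).astype(int)
    svm = SVC(kernel="rbf").fit(xy[train], labels)
    return svm.predict(xy) == 1                     # low-WOPR-error side
```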

Keywords: history matching, principal component analysis, reservoir modelling, support vector machine

Procedia PDF Downloads 160
15676 Real-Time PCR to Determine Resistance Genes in ESBL Escherichia coli Strains Stored in the Epidemic Diseases Laboratory of the National Institute of Hygiene (INH)

Authors: A. Qasmaoui, F. Ohmani, Z. Zaine, I. El Akrad, J. Hamamouchi, K. Halout, B. Belkadi, R. Charof

Abstract:

The evolution of antibiotic resistance is a crucial aspect of the problem related to the intensive use of these substances in human and veterinary medicine. The production of extended-spectrum β-lactamase (ESBL) enzymes is the main mechanism of resistance to β-lactam antibiotics in Escherichia coli. The objective of our work was to determine the resistance genes in ESBL E. coli strains stored at the epidemic diseases laboratory of the National Institute of Hygiene. The strains were identified according to classic bacteriological criteria. An antibiogram was performed on the isolated strains by the Mueller-Hinton agar disc diffusion method. ESBL production in the strains was detected by the synergy assay technique and confirmed for the presence of the blaCTX-M1, blaCTX-M2, blaTEM, blaSHV and blaOXA-48 genes by gene amplification. Of the 27 E. coli strains examined, 17 (63%) presented the extended-spectrum β-lactamase phenotype. All 18 cefotaxime-resistant strains were analyzed for an ESBL phenotype, and all were positive in the double-disc synergy assay. The fight against the emergence and spread of these multi-resistant strains requires the reasonable use of antibiotics.

Keywords: E. coli, ESBL, CTX, TEM, SHV, OXA, antibiotic resistance

Procedia PDF Downloads 24
15675 Hominin Niche in the Times of Climate Change

Authors: Emilia Hunt, Sally C. Reynolds, Fiona Coward, Fabio Parracho Silva, Philip Hopley

Abstract:

Ecological niche modeling is widely used in conservation studies, but its application to extinct hominin species is a relatively new approach. Understanding which ecological niches were occupied by the respective hominin species provides a new perspective on the influences on evolutionary processes. Niche separation or overlap can tell us more about the specific requirements of a species within a given timeframe. Many of the ancestral species lived through enormous climate changes, including glacial and interglacial periods and changes in rainfall leading to desertification or flooding of regions, and displayed the impressive levels of adaptation necessary for their survival. This paper reviews niche modeling methodologies and their application to hominin studies. Traditional conservation methods may not be directly applicable to extinct species and are not directly transferable to hominins. The hominin niche also includes aspects such as technology, the use of fire and extended communication, which are not traditionally used in building conservation models. Future perspectives on how to improve niche modeling for extinct hominin species will be discussed.

Keywords: hominin niche, climate change, evolution, adaptation, ecological niche modelling

Procedia PDF Downloads 190
15674 The Impact of Step-By-Step Program in the Public Preschool Institutions in Kosova

Authors: Rozafa Shala

Abstract:

The development of preschool education in Kosovo has passed through several periods. The period after the 1999 war was a very intensive one, during which preschool education started to change. The Step-by-Step program was one of the programs that spread very widely in the period after the 1999 war, and it remains widespread today. The aim of this study is to present the impact of the Step-by-Step program on preschool education. This research is based on the hypothesis that the Step-by-Step program continues to be present, through its elements, in all the other programs that teachers may use. For data collection, a questionnaire was constructed and distributed to 25 preschool teachers who work in public preschool institutions; all of them had completed the Step-by-Step training. To support the questionnaire data, a focus group was also organized, in which the critical issues of the program were discussed. From the results obtained, we can conclude that the Step-by-Step program has a very strong impact at the preschool level. Many of its specific elements, such as circle time, the weather calendar, the classroom environment, and portfolios, are present in most preschool classes, and the teachers' approach also retains many elements of the Step-by-Step program.

Keywords: preschool education, step-by-step program, impact, teachers

Procedia PDF Downloads 354
15673 Expression of Human Papillomavirus Type 18 L1 Virus-Like Particles in the Methylotrophic Yeast, Pichia Pastoris

Authors: Hossein Rassi, Marjan Moradi Fard, Samaneh Niko

Abstract:

Human papillomavirus (HPV) types 16 and 18 are closely associated with the development of human cervical carcinoma, which is one of the most common causes of cancer death in women worldwide. At present, HPV type 18 accounts for about 34% of all HPV infections in Iran, and the most promising vaccine against HPV infection is based on the L1 major capsid protein. The L1 protein of HPV18 has the capacity to self-assemble into capsomers or virus-like particles (VLPs) that are non-infectious and highly immunogenic, allowing their use in vaccine production. The methylotrophic yeast Pichia pastoris is an efficient and inexpensive expression system used to produce high levels of heterologous proteins. In this study, we expressed HPV18 L1 VLPs in P. pastoris. The gene encoding the major capsid protein L1 of the high-risk HPV type 18 was isolated from an Iranian patient by PCR and inserted into the pTG19-T vector to obtain the recombinant expression vector pTG19-HPV18-L1. pTG19-HPV18-L1 was then transformed into E. coli strain DH5α, and the recombinant HPV18 L1 protein was expressed in soluble form under IPTG induction. The HPV18 L1 gene was excised from the recombinant plasmid with XhoI and EcoRI enzymes, ligated into the yeast expression vector pPICZα linearized with the same enzymes, and transformed into P. pastoris. Induction and expression of the HPV18 L1 protein were demonstrated by BMGY/BMMY cultivation and RT-PCR. The parameters for induced cultivation of the P. pastoris KM71 strain carrying HPV18 L1 were investigated in shaking-flask cultures. After induced cultivation in BMMY (pH 7.0) medium supplemented with methanol to a final concentration of 1.0% every 24 h at 37 °C for 96 h, the recombinant strain produced 78.6 mg/L of L1 protein. This work opens the possibility of producing a prophylactic vaccine against cervical carcinoma from the HPV-18 L1 gene in P. pastoris; VLP-based HPV vaccines could prevent persistent HPV18 infections and cervical cancer in Iran. The HPV-18 L1 gene was also expressed successfully in E. coli, which provides the necessary basis for preparing an HPV-18 L1 vaccine for humans. HPV L1 proteins expressed in Pichia pastoris will facilitate HPV vaccine development and structure-function studies.

Keywords: Pichia pastoris, L1 virus-like particles, human papillomavirus type 18, biotechnology

Procedia PDF Downloads 407
15672 Experiments on Weakly-Supervised Learning on Imperfect Data

Authors: Yan Cheng, Yijun Shao, James Rudolph, Charlene R. Weir, Beth Sahlmann, Qing Zeng-Treitler

Abstract:

Supervised predictive models require labeled data for training. Complete and accurate labeled data, i.e., a ‘gold standard’, is not always available, and imperfectly labeled data may need to serve as an alternative. An important question is whether the accuracy of the labeled data creates a performance ceiling for the trained model. In this study, we trained several models to recognize the presence of delirium in clinical documents using data with annotations that are not completely accurate (i.e., weakly-supervised learning). In the external evaluation, the support vector machine model with a linear kernel performed best, achieving an area under the curve of 89.3% and accuracy of 88%, surpassing the 80% accuracy of the training sample. We then generated a set of simulated data and carried out a series of experiments which demonstrated that models trained on imperfect data can (but do not always) outperform the accuracy of the training data; e.g., the area under the curve for some models is higher than 80% when trained on data with an error rate of 40%. Our experiments also showed that the error resistance of linear modeling is associated with larger sample size, error type, and linearity of the data (all p-values < 0.001). In conclusion, this study sheds light on the usefulness of imperfect data in clinical research via weakly-supervised learning.
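
A small simulation in the spirit of these experiments is sketched below; the sample sizes and the 40% error rate are illustrative assumptions:

```python
# Label-noise simulation: flip training labels, fit a linear SVM, and compare
# its test accuracy with the accuracy of the noisy labels themselves.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

rng = np.random.default_rng(42)
X, y = make_classification(n_samples=4000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = X[:3000], X[3000:], y[:3000], y[3000:]

error_rate = 0.4                                   # 40% wrong training labels
flip = rng.random(len(y_tr)) < error_rate
y_noisy = np.where(flip, 1 - y_tr, y_tr)

model = LinearSVC().fit(X_tr, y_noisy)
print("label accuracy:", 1 - error_rate)
print("model accuracy:", model.score(X_te, y_te))  # can exceed label accuracy
```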

Keywords: weakly-supervised learning, support vector machine, prediction, delirium, simulation

Procedia PDF Downloads 200
15671 Control Power in Doubly Fed Induction Generator Wind Turbine with SVM Control Inverter

Authors: Zerzouri Nora, Benalia Nadia, Bensiali Nadia

Abstract:

This paper presents a grid-connected wind power generation scheme using a Doubly Fed Induction Generator (DFIG), which can supply power at constant voltage and constant frequency while the rotor speed varies. This makes it suitable for variable-speed wind energy applications. The DFIG system consists of a wind turbine, an asynchronous wound-rotor induction generator, and an inverter with a Space Vector Modulation (SVM) controller. The stator is connected directly to the grid, while the rotor winding interfaces with the grid through a rotor-side converter and a grid-side converter. The use of a back-to-back SVM converter in the rotor circuit results in low-distortion currents, reactive power control, and variable-speed operation. Mathematical modeling of the DFIG is carried out in order to analyze the performance of the system, which is simulated using MATLAB. The simulation results show that the system can operate at variable speed with low harmonic current distortion. The objective is to track and extract maximum power from the wind energy system and transfer it to the grid for useful work.
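
The modulation step can be sketched with the textbook two-level SVPWM equations (generic reference code, not the authors' controller):

```python
# Textbook two-level space vector modulation: dwell times for one period.
import math

def svpwm_times(v_ref, theta, v_dc, t_s):
    """v_ref: reference vector magnitude, theta: its angle in rad,
    v_dc: DC-link voltage, t_s: switching period. Returns (t1, t2, t0)."""
    sector = int(theta // (math.pi / 3)) % 6        # sectors 0..5
    alpha = theta - sector * math.pi / 3            # angle inside the sector
    m = math.sqrt(3) * v_ref / v_dc                 # modulation index
    t1 = t_s * m * math.sin(math.pi / 3 - alpha)    # adjacent active vector 1
    t2 = t_s * m * math.sin(alpha)                  # adjacent active vector 2
    t0 = t_s - t1 - t2                              # zero-vector time
    return t1, t2, t0
```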

Keywords: Doubly Fed Induction Generator, Wind Energy Conversion Systems, Space Vector Modulation, distortion harmonics

Procedia PDF Downloads 485