Search results for: common vector approach

18752 A Dynamic Software Product Line Approach to Self-Adaptive Genetic Algorithms

Authors: Abdelghani Alidra, Mohamed Tahar Kimour

Abstract:

Genetic algorithms must adapt themselves at design time to cope with the specific requirements of the search problem and at runtime to balance exploration and convergence objectives. In a previous article, we showed that modeling and implementing Genetic Algorithms (GA) using the software product line (SPL) paradigm is worthwhile because GAs constitute a product family sharing a common code base. In the present article, we propose to extend the feature model of the genetic algorithm family to model the potential states of the GA in what is called a Dynamic Software Product Line. The objective of this paper is the systematic generation of a reconfigurable architecture that supports the dynamics of the GA and is easily deduced from the feature model. The resultant GA is able to perform dynamic reconfiguration autonomously to speed up the convergence process while producing better solutions. Another important advantage of our approach is the exploitation of recent advances in the domain of dynamic SPLs to enhance the performance of GAs.

Keywords: self-adaptive genetic algorithms, software engineering, dynamic software product lines, reconfigurable architecture

Procedia PDF Downloads 285
18751 A Three-Step Iterative Process for Common Fixed Points of Three Contractive-Like Operators

Authors: Safeer Hussain Khan, H. Fukhar-ud-Din

Abstract:

The concept of quasi-contractive type operators was given by Berinde and extended by Imoru and Olatinwo, who named this new type contractive-like operators. On the other hand, Xu and Noor introduced a three-step, one-mapping iterative process which can be seen as a generalization of the Mann and Ishikawa iterative processes. Approximating common fixed points has its own importance as it has a direct link with the minimization problem. Motivated by this, in this paper, we first extend the iterative process of Xu and Noor to the case of three steps and three mappings and then prove a strong convergence result using contractive-like operators for this iterative process. In general, this generalizes corresponding results using the Mann, Ishikawa, and Xu-Noor iterative processes with quasi-contractive type operators. It is to be pointed out that our results can also be proved with iterative processes involving error terms.
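For orientation, a three-step, three-mapping scheme of Noor type could take the following form; this is a plausible sketch of the kind of iteration involved, with mappings T1, T2, T3 and control sequences {a_n}, {b_n}, {c_n} in [0,1] assumed for illustration rather than taken from the paper:

```latex
% Hypothetical three-step, three-mapping iteration of Noor type
\begin{aligned}
z_n     &= (1 - c_n)\, x_n + c_n\, T_3 x_n, \\
y_n     &= (1 - b_n)\, x_n + b_n\, T_2 z_n, \\
x_{n+1} &= (1 - a_n)\, x_n + a_n\, T_1 y_n .
\end{aligned}
```

Taking T_1 = T_2 = T_3 recovers a Xu-Noor-type one-mapping scheme, while setting c_n = 0 (or b_n = c_n = 0) collapses it to Ishikawa (or Mann) iterations.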

Keywords: contractive-like operator, iterative process, common fixed point, strong convergence

Procedia PDF Downloads 594
18750 Comparison of Buyback Contracts and Concession Regimes in the Regime of the Common Law System and the Islamic Legal Regime

Authors: Javid Zarei

Abstract:

International buyback contracts are a type of service contract. These kinds of contracts are the most important instrument for attracting foreign investors under Iran's laws. These contracts have been the basis of commercial and economic relations between Iran and foreign companies for about 30 years. The legal structure of this type of contract has gradually evolved, so today an advanced generation of it, under the title of the Iran Petroleum Contract, is being used in the industry of Iran. This article analytically examines Iran's commercial contracts in the oil industry and contracting services and devotes sections to the strengths and weaknesses of these oil contracts. The research also compares the concession regime, derived from the common law legal system, with buyback contracts, derived from the Islamic legal system.

Keywords: buyback contracts, concession regime, ownership, common law legal system, Islamic legal system of Iran

Procedia PDF Downloads 82
18749 Refinery Sulfur as an Alternative Agent to Decrease Pesticide Exposure in Pistachio Orchards and Common Pistachio Psylla’s Control

Authors: Mehdi Basirat, Mohammad Rouhani, Shahla Borzouei, Majid Zarangi, Asma Abolghasemi, Mohammad Fazel Soltani, Mohammad Gorji, Mohammad Amin Samih

Abstract:

The common pistachio psylla, Agonoscena pistaciae Burckhardt and Lauterer (Hemiptera: Aphalaridae), as one of the most detrimental pests in all pistachio producing regions, causes great economic damage to pistachio trees. Nowadays, various pesticides are used to control the common pistachio psylla, resulting in heavy pesticide exposure in orchards. In this study, field experiments were conducted during 2018–2021 to assess the effects of sulfur on A. pistaciae. The study compared sulfur with asafoetida extract and a pesticide (acetamiprid) against A. pistaciae in a randomized complete block design with three replications. Analysis of variance showed that the effect of treatments on eggs (F(2,24) = 17.61, P = 0.00) and nymphs (F(2,24) = 18.29, P = 0.00) was significant at the 1% level. The results demonstrated that sulfur provided significantly greater control of eggs and nymphs than the plant extract and the pesticide (negative control). These results provide support for the potential use of sulfur as an alternative pest management tool against A. pistaciae. The results clearly indicated that sulfur could control the common pistachio psylla population for at least six weeks.

Keywords: Agonoscena pistaciae, pesticide exposure, pistachio, sulfur

Procedia PDF Downloads 165
18748 Radar Fault Diagnosis Strategy Based on Deep Learning

Authors: Bin Feng, Zhulin Zong

Abstract:

Radar systems are critical to modern military, aviation, and maritime operations, and their proper functioning is essential for the success of these operations. However, due to the complexity and sensitivity of radar systems, they are susceptible to various faults that can significantly affect their performance. Traditional radar fault diagnosis strategies rely on expert knowledge and rule-based approaches, which are often limited in effectiveness and require considerable time and resources. Deep learning has recently emerged as a promising approach for fault diagnosis due to its ability to automatically learn features and patterns from large amounts of data. In this paper, we propose a radar fault diagnosis strategy based on deep learning that can accurately identify and classify faults in radar systems. Our approach uses convolutional neural networks (CNNs) to extract features from radar signals and to classify faults from those features. The proposed strategy is trained and validated on a dataset of measured radar signals with various types of faults. The results show that it achieves high accuracy in fault diagnosis. To further evaluate the effectiveness of the proposed strategy, we compare it with traditional rule-based approaches and other machine learning-based methods, including decision trees, support vector machines (SVMs), and random forests. The results demonstrate that our deep learning-based approach outperforms the traditional approaches in terms of accuracy and efficiency. Finally, we discuss the potential applications and limitations of the proposed strategy, as well as future research directions. Our study highlights the importance and potential of deep learning for radar fault diagnosis and suggests that it can be a valuable tool for improving the performance and reliability of radar systems. In summary, this paper presents a radar fault diagnosis strategy based on deep learning that achieves high accuracy and efficiency in identifying and classifying faults in radar systems. The proposed strategy has significant potential for practical applications and can pave the way for further research.
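As a rough illustration of the kind of classifier described (the actual architecture, layer sizes, and signal preprocessing are not specified in the abstract, so everything below is an assumption), a minimal 1D CNN that maps a windowed radar signal to one of several fault classes might look like this:

```python
import torch
import torch.nn as nn

class RadarFaultCNN(nn.Module):
    """Minimal sketch: 1D CNN that classifies a fixed-length radar signal
    window into n_faults categories. Layer sizes are illustrative only."""
    def __init__(self, n_faults: int, signal_len: int = 1024):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(4),
        )
        self.classifier = nn.Linear(32 * (signal_len // 16), n_faults)

    def forward(self, x):              # x: (batch, 1, signal_len)
        z = self.features(x)
        return self.classifier(z.flatten(1))

# Toy usage with random tensors standing in for measured radar signals
model = RadarFaultCNN(n_faults=4)
dummy = torch.randn(8, 1, 1024)
logits = model(dummy)                  # (8, 4) class scores
print(logits.shape)
```

Training such a network against labeled fault windows, and comparing it with SVM or random forest baselines on the same features, follows the standard supervised workflow.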

Keywords: radar system, fault diagnosis, deep learning, radar fault

Procedia PDF Downloads 90
18747 A Comparative Study of Optimization Techniques and Models for Forecasting Dengue Fever

Authors: Sudha T., Naveen C.

Abstract:

Dengue is a serious public health issue that causes significant annual economic and welfare burdens on nations. However, enhanced optimization techniques and quantitative modeling approaches can predict the incidence of dengue. By advocating for a data-driven approach, public health officials can make informed decisions, thereby improving the overall effectiveness of sudden disease outbreak control efforts. This study uses environmental data from two U.S. federal government agencies, the National Oceanic and Atmospheric Administration and the Centers for Disease Control and Prevention. Based on environmental data that describe changes in temperature, precipitation, vegetation, and other factors known to affect dengue incidence, several predictive models are constructed that use different machine learning methods to estimate weekly dengue cases. The first step involves preparing the data, which includes handling outliers and missing values so the data is ready for subsequent processing and the creation of an accurate forecasting model. In the second phase, multiple feature selection procedures are applied using various machine learning models and optimization techniques. During the third phase of the research, machine learning models such as the Huber Regressor, Support Vector Machine, Gradient Boosting Regressor (GBR), and Support Vector Regressor (SVR) are compared with several optimization techniques for feature selection, such as Harmony Search and the Genetic Algorithm. In the fourth stage, the model's performance is evaluated using Mean Square Error (MSE), Mean Absolute Error (MAE), and Root Mean Square Error (RMSE). The goal is to select an optimization strategy with the fewest errors, the lowest cost, the greatest productivity, or the best potential results. Optimization is widely employed in a variety of fields, including engineering, science, management, mathematics, finance, and medicine. An effective optimization method based on Harmony Search and an integrated Genetic Algorithm is introduced for input feature selection, and it yields a notable improvement in the model's predictive accuracy. The predictive models built on the Huber Regressor perform best for both optimization and prediction.
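A minimal sketch of the general idea (genetic-algorithm feature selection wrapped around a Huber Regressor, scored with cross-validated RMSE); the synthetic data, population size, and mutation rate below are assumptions for illustration, not the study's settings:

```python
import numpy as np
from sklearn.linear_model import HuberRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 12))                      # stand-in climate features
y = X[:, 0] * 3 - X[:, 3] + rng.normal(size=300)    # stand-in weekly case counts

def fitness(mask):
    """Negative cross-validated RMSE for the feature subset given by mask."""
    if mask.sum() == 0:
        return -np.inf
    return cross_val_score(HuberRegressor(), X[:, mask.astype(bool)], y,
                           scoring="neg_root_mean_squared_error", cv=3).mean()

# Tiny hand-rolled GA over binary feature masks (illustrative sizes only)
pop = rng.integers(0, 2, size=(20, X.shape[1]))
for gen in range(15):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]          # keep the best half
    children = []
    for _ in range(10):
        a, b = parents[rng.integers(0, 10, size=2)]
        cut = rng.integers(1, X.shape[1])
        child = np.concatenate([a[:cut], b[cut:]])   # one-point crossover
        flip = rng.random(X.shape[1]) < 0.1          # mutation
        children.append(np.where(flip, 1 - child, child))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected features:", np.flatnonzero(best))
```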

Keywords: deep learning model, dengue fever, prediction, optimization

Procedia PDF Downloads 65
18746 Wind Power Forecasting Using Echo State Networks Optimized by Big Bang-Big Crunch Algorithm

Authors: Amir Hossein Hejazi, Nima Amjady

Abstract:

In recent years, due to environmental issues, traditional energy sources have been replaced by renewable ones. Wind energy, as the fastest growing renewable energy source, accounts for a considerable share of electricity markets. With this fast growth of wind energy worldwide, owners and operators of wind farms, transmission system operators, and energy traders need reliable and secure forecasts of wind energy production. In this paper, a new forecasting strategy is proposed for short-term wind power prediction based on Echo State Networks (ESN). The forecast engine utilizes a state-of-the-art training process, including a dynamical reservoir with a high capability to learn the complex dynamics of wind power or wind vector signals. The study becomes more interesting by incorporating the prediction of wind direction into the forecast strategy. The Big Bang-Big Crunch (BB-BC) evolutionary optimization algorithm is adopted for adjusting the free parameters of the ESN-based forecaster. The proposed method is tested on real-world hourly data to show the efficiency of the forecasting engine for prediction of both the wind vector and the wind power output of aggregated wind power production.
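A minimal echo state network sketch in NumPy, shown only to make the reservoir idea concrete; the reservoir size, spectral radius, readout, and the BB-BC tuning of these free parameters are all assumptions rather than the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_res = 1, 200
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))      # scale spectral radius below 1

def run_reservoir(u):
    """Collect reservoir states for an input sequence u (shape: T x n_in)."""
    x = np.zeros(n_res)
    states = []
    for t in range(len(u)):
        x = np.tanh(W_in @ u[t] + W @ x)       # leak rate omitted for brevity
        states.append(x.copy())
    return np.array(states)

# Toy one-step-ahead forecast of a noisy sine standing in for a wind power signal
series = np.sin(np.linspace(0, 60, 1200)) + 0.1 * rng.normal(size=1200)
U, Y = series[:-1, None], series[1:]
S = run_reservoir(U)
W_out, *_ = np.linalg.lstsq(S[:1000], Y[:1000], rcond=None)   # linear readout
pred = S[1000:] @ W_out
print("test RMSE:", np.sqrt(np.mean((pred - Y[1000:]) ** 2)))
```

In the paper's setting, the free parameters of such a forecaster (reservoir scaling, input weights, etc.) are the quantities adjusted by the BB-BC algorithm.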

Keywords: wind power forecasting, echo state network, big bang-big crunch, evolutionary optimization algorithm

Procedia PDF Downloads 572
18745 Art Street as a Way for Reflective Thinking in the Field of Adult and Primary Education: Examples of Educational Techniques

Authors: Georgia H. Mega

Abstract:

Art street, a category of artwork displayed in public spaces, has been recognized as a potential tool for promoting reflective thinking in both adult and primary education. Educational techniques that encourage critical and creative thinking, as well as deeper reflection, have been developed and applied in educational curricula. This paper aims to explore the potential of art street in cultivating learners' reflective awareness toward multiculturalism. The main objective of this case study is to investigate the possibilities that art street offers in terms of developing learners' critical reflection, regardless of their age. The study compares two art street works from Greece and Norway, focusing on their common theme of multiculturalism. The study adopts a qualitative methodology, specifically a case study approach. This approach allows for an in-depth analysis of the two selected art street works and their impact on learners' reflective thinking. The study demonstrates that art street can effectively cultivate learners' reflective awareness of multiculturalism. The selected works of art, despite being created by different artists and displayed in different cities, share similar content and convey messages that facilitate reflective dialogue on cultural osmosis. Both adult and primary education approaches utilize the same art street works to achieve reflective awareness. This paper contributes to the existing literature on reflective learning processes by highlighting the potential of art street as a means for encouraging reflective thinking. It builds upon the theoretical frameworks of adult education theorists such as Freire and Mezirow, as well as those of primary education theorists such as Perkins and Project Zero. Data for this study were collected through observation and analysis of two art street works, one from Greece and one from Norway. These works were selected based on their common theme of multiculturalism. The collected data were analyzed using qualitative analysis techniques. The researchers examined the content and messages conveyed by the selected art street works and explored their impact on learners' reflective thinking. The central question addressed in this study is whether art street can develop learners' critical reflection toward multiculturalism, regardless of their age. The findings of this study support the notion that art street can effectively cultivate learners' reflective awareness toward multiculturalism. The selected art street works, despite their differences in origin and location, share common themes that encourage reflective dialogue. The use of art street in both adult and primary education approaches showcases its potential as a tool for promoting reflective learning processes. Overall, this paper contributes to the understanding of art street as a means for reflective thinking in the field of adult and primary education.

Keywords: art street, educational techniques, multiculturalism, observation of artworks, reflective awareness

Procedia PDF Downloads 75
18744 Statistical Analysis of Extreme Flow (Regions of Chlef)

Authors: Bouthiba Amina

Abstract:

The estimation of statistics related to precipitation is a vast domain that poses numerous challenges to meteorologists and hydrologists. Sometimes it is necessary to approximate the values of extreme events, as well as their return periods, for sites where there is little or no data. The search for a frequency model of daily rainfall depths is of great importance in operational hydrology: it establishes a basis for predicting the frequency and intensity of floods by estimating the amount of precipitation in past years. The best-known and most common approach is the statistical one: it consists of looking for the probability law that best fits the values observed for the random variable "daily maximum rainfall", after comparing various probability laws and estimation methods by means of goodness-of-fit tests. Therefore, a frequency analysis of the annual series of daily maximum rainfall was carried out on data from 54 rain-gauge stations of the upper and middle basin. Five laws usually applied to the study and analysis of maximum daily rainfall were considered. The chosen period is from 1970 to 2013, and it was used to forecast quantiles. The laws used are the three-parameter generalized extreme value law, the two-parameter extreme value laws (Gumbel and log-normal), and the three-parameter Pearson type III and Log-Pearson III laws. In Algeria, Gumbel's law has long been used to estimate the quantiles of maximum flows; here, we check this practice and choose the most reliable law.
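As a small illustration of the quantile and return-period step (the station data and the chosen law are not reproduced here, so the numbers below are synthetic), fitting a Gumbel law and reading off the T-year quantile could look like this:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Stand-in annual maximum daily rainfall series (44 years, like 1970-2013)
annual_max_rain = stats.gumbel_r.rvs(loc=40, scale=12, size=44, random_state=rng)

loc, scale = stats.gumbel_r.fit(annual_max_rain)           # maximum-likelihood fit
for T in (10, 50, 100):                                     # return periods in years
    q = stats.gumbel_r.ppf(1 - 1 / T, loc=loc, scale=scale)
    print(f"{T:>3}-year daily maximum rainfall quantile: {q:.1f} mm")

# Goodness of fit could be checked with, e.g., a Kolmogorov-Smirnov test
print(stats.kstest(annual_max_rain, "gumbel_r", args=(loc, scale)))
```

The same pattern applies to the other candidate laws (GEV, log-normal, Pearson III, Log-Pearson III) before selecting the best-fitting one.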

Keywords: return period, extreme flow, statistics laws, Gumbel, estimation

Procedia PDF Downloads 78
18743 Development of a Complete Single Jet Common Rail Injection System Gas Dynamic Model for Hydrogen Fueled Engine with Port Injection Feeding System

Authors: Mohammed Kamil, M. M. Rahman, Rosli A. Bakar

Abstract:

Modeling of the hydrogen fueled engine (H2ICE) injection system is a very important tool that can be used for explaining or predicting the effect of advanced injection strategies on combustion and emissions. In this paper, a common rail injection system (CRIS) is proposed for a four-stroke, four-cylinder hydrogen fueled engine with a port injection feeding system (PIH2ICE). For this system, a numerical one-dimensional gas dynamic model is developed considering a single injection event per injector per cycle. One-dimensional flow equations in conservation form are used to simulate the wave propagation phenomenon throughout the common rail (accumulator). Using this model, the effect of the common rail on the injection system characteristics is clarified. These characteristics include rail pressure, sound velocity, rail mass flow rate, injected mass flow rate, and pressure drop across the injectors. The interaction effects of operational conditions (engine speed and rail pressure) and geometrical features (injector hole diameter) are illustrated, and the required compromise solutions are highlighted. The CRIS is shown to be a promising enhancement for the PIH2ICE.
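For reference, 1D gas-dynamic rail models of this kind are typically built on the unsteady flow equations in conservation form shown below; the specific source terms used in the paper (wall friction, heat transfer, injector outflow) are not given in the abstract, so S(x,t) is left unspecified here:

```latex
% Assumed basis: 1D unsteady flow equations in conservation form
\frac{\partial}{\partial t}
\begin{pmatrix} \rho \\ \rho u \\ \rho E \end{pmatrix}
+ \frac{\partial}{\partial x}
\begin{pmatrix} \rho u \\ \rho u^{2} + p \\ (\rho E + p)\,u \end{pmatrix}
= \mathbf{S}(x,t),
\qquad E = e + \tfrac{1}{2}u^{2}, \qquad p = \rho R T .
```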

Keywords: common rail, hydrogen engine, port injection, wave propagation

Procedia PDF Downloads 424
18742 A Collaborative Problem Driven Approach to Design an HR Analytics Application

Authors: L. Atif, C. Rosenthal-Sabroux, M. Grundstein

Abstract:

The requirements engineering process is a crucial phase in the design of complex systems. The purpose of our research is to present a collaborative, problem-driven requirements engineering approach that aims at improving the design of a Decision Support System (DSS) as an Analytics application. This approach has been adopted to design a Human Resource management DSS. The requirements engineering process is presented as a series of guidelines for activities that must be implemented to ensure that the final product satisfies end-user requirements and takes into account the limitations identified. For this, we know that a well-posed statement of the problem is "a problem whose crucial character arises from collectively produced estimation and a formulation found to be acceptable by all the parties". Moreover, we know that DSSs were developed to help decision-makers solve their unstructured problems. We thus base our research on the assumption that developing a DSS, particularly for helping with poorly structured or unstructured decisions, cannot be done without considering end-user decision problems, how to represent them collectively, the content of decisions, their meaning, and the decision-making process; the field issues thus arise in a multidisciplinary perspective. Our approach is problem-driven and collaborative in designing DSS technologies: it reflects common end-user problems in the upstream design phase, and in the downstream phase these problems determine the design choices and potential technical solutions. We thus rely on a categorization of HR's problems for a development mirroring the Analytics solution. This brings out a data-driven DSS typology: Descriptive Analytics, Explicative or Diagnostic Analytics, Predictive Analytics, and Prescriptive Analytics. In our research, identifying the problem takes place alongside the design of the solution, so we have to carry out significant transformations of the representations associated with the HR Analytics application to build an increasingly detailed representation of the goal to be achieved. Here, collective cognition is reflected in the establishment of transfer functions of representations throughout the design process.

Keywords: DSS, collaborative design, problem-driven requirements, analytics application, HR decision making

Procedia PDF Downloads 295
18741 Effect of Migraine on Functional Performance and Reported Symptoms in Children with Concussion

Authors: Abdulaziz Alkathiry

Abstract:

Concussion is a common brain injury that affects physical and cognitive performance. While several studies have indicated that adolescents are more likely to develop concussion, in the last decade concussion has been mainly explored in adults. Migraine has been identified as a common symptom reported after concussion and has been tied to worse prognoses. Hence, we aimed to investigate the effect of migraine on functional performance and self-reported symptoms in children with concussion. This cross-sectional study involved 35 symptomatic children aged 9–17 years recruited within one year of their concussion injury at a tertiary balance center. Participants' symptoms and functional performance were assessed using the Post-Concussion Symptoms Scale (PCSS) and the Functional Gait Assessment (FGA), respectively. Concussed children with migraine showed significantly worse symptoms, including fatigue, sleeping impairment, difficulty concentrating, and visual problems (P < 0.05). Functional performance did not show differences between concussed children with and without migraine. Although concussed children with and without migraine did not show any differences in functional performance, worse cognitive symptoms were found in concussed children with migraine. A customized treatment approach is indicated in the presence of migraine for the management of children with concussion.

Keywords: concussion, migraine, post-concussion symptoms scale, functional gait assessment, balance

Procedia PDF Downloads 344
18740 The Log S-fbm Nested Factor Model

Authors: Othmane Zarhali, Cécilia Aubrun, Emmanuel Bacry, Jean-Philippe Bouchaud, Jean-François Muzy

Abstract:

The Nested factor model was introduced by Bouchaud et al., where the asset return fluctuations are explained by common factors representing the market economic sectors and residuals (noises) sharing with the factors a common dominant volatility mode in addition to the idiosyncratic mode proper to each residual. This construction implies that the factor and residual log-volatilities are correlated. Here, we consider the case of a single factor where the only dominant common mode is an S-fbm process (introduced by Peng, Bacry, and Muzy) with Hurst exponent H around 0.11, and the residuals have, in addition to the previous common mode, idiosyncratic components with Hurst exponents H around 0. The reason for considering this configuration is twofold: to preserve the Nested factor model's characteristics introduced by Bouchaud et al. and to propose a framework through which the stylized fact reported by Peng et al. is reproduced, where it has been observed that the Hurst exponents of stock indices are large compared to those of individual stocks. In this work, we show that the Log S-fbm Nested factor model's construction leads to the Hurst exponents of single stocks being those of the idiosyncratic volatility modes and the Hurst exponent of the index being that of the common volatility mode. Furthermore, we propose a statistical procedure to estimate the factor Hurst exponent from the stock return dynamics, together with theoretical guarantees, with good results in the limit where the number of stocks N goes to infinity. Last but not least, we show that the factor can be seen as an index constructed from the single stocks weighted by specific coefficients.
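To fix notation, a schematic of the single-factor nested structure described above might be written as follows; the symbols and normalizations here are assumptions for illustration, not the authors' exact formulation:

```latex
% Schematic single-factor nested structure (assumed notation)
\begin{aligned}
r_i(t) &= \beta_i\, f(t) + \varepsilon_i(t), \qquad
f(t) = \sigma_f(t)\,\xi(t), \qquad
\varepsilon_i(t) = \sigma_i(t)\,\eta_i(t), \\[4pt]
\log \sigma_f(t) &= \Omega_H(t), \qquad
\log \sigma_i(t) = \Omega_H(t) + \omega_i(t),
\end{aligned}
```

where Omega_H is the dominant common S-fbm log-volatility mode with Hurst exponent H around 0.11 and the omega_i are idiosyncratic modes with Hurst exponents close to 0, so that single-stock volatilities inherit the idiosyncratic roughness while the index inherits the common mode.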

Keywords: hurst exponent, log S-fbm model, nested factor model, small intermittency approximation

Procedia PDF Downloads 49
18739 Implementation of Free-Field Boundary Condition for 2D Site Response Analysis in OpenSees

Authors: M. Eskandarighadi, C. R. McGann

Abstract:

It is observed from past earthquakes that local site conditions can significantly affect the strong ground motion characteristics experienced at a site. One-dimensional seismic site response analysis is the most common approach for investigating site response. This approach assumes that the soil is homogeneous and infinitely extended in the horizontal direction. Therefore, tying the side boundaries together is one way to model this behavior, as the wave passage is assumed to be only vertical. However, 1D analysis cannot capture the 2D nature of wave propagation, soil heterogeneity, and 2D soil profiles with features such as inclined layer boundaries. In contrast, 2D seismic site response modeling can consider all of the mentioned factors to better understand local site effects on strong ground motions. 2D wave propagation, and the fact that the soil profile on the two sides of the model may not be identical, make clear the importance of a boundary condition on each side that can minimize unwanted reflections from the edges of the model and input appropriate loading conditions. Ideally, the model size should be sufficiently large to minimize wave reflection; however, due to computational limitations, increasing the model size is impractical in some cases. Another approach is to employ free-field boundary conditions that take into account the free-field motion that would exist far from the model domain and apply it to the sides of the model. This research focuses on implementing free-field boundary conditions in OpenSees for 2D site response analysis. Comparisons are made between 1D models and 2D models with various boundary conditions, and the details and limitations of the developed free-field boundary modeling approach are discussed.
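To make the simpler "tied boundary" idea mentioned above concrete, a minimal openseespy sketch is shown below: nodes at matching elevations on the left and right edges are constrained to move together, which enforces 1D-like, vertically propagating response. The mesh dimensions are arbitrary, and the free-field column boundary developed in the paper is more involved and is not shown here.

```python
import openseespy.opensees as ops

ops.model('basic', '-ndm', 2, '-ndf', 2)

width, depth, dz = 20.0, 10.0, 1.0
n_layers = int(depth / dz)
left, right = [], []
for i in range(n_layers + 1):
    z = -i * dz
    lt, rt = 100 + i, 200 + i
    ops.node(lt, 0.0, z)          # left edge node
    ops.node(rt, width, z)        # right edge node at the same elevation
    left.append(lt)
    right.append(rt)

ops.fix(left[-1], 1, 1)           # base fixed (illustrative)
ops.fix(right[-1], 1, 1)

for lt, rt in zip(left[:-1], right[:-1]):
    ops.equalDOF(lt, rt, 1, 2)    # tie horizontal and vertical DOFs of edge pairs
```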

Keywords: boundary condition, free-field, opensees, site response analysis, wave propagation

Procedia PDF Downloads 158
18738 Regeneration of Geological Models Using Support Vector Machine Assisted by Principal Component Analysis

Authors: H. Jung, N. Kim, B. Kang, J. Choe

Abstract:

History matching is a crucial procedure for predicting reservoir performance and making future decisions. However, it is difficult due to the uncertainties of initial reservoir models. Therefore, it is important to have reliable initial models for successful history matching of highly heterogeneous reservoirs such as channel reservoirs. In this paper, we propose a novel scheme for regenerating geological models using a support vector machine (SVM) and principal component analysis (PCA). First, we perform PCA to identify the main geological characteristics of the models. Through this procedure, the permeability values of each model are transformed into new parameters by the principal components, which have eigenvalues of large magnitude. Secondly, the parameters are projected onto a two-dimensional plane by multi-dimensional scaling (MDS) based on Euclidean distances. Finally, we train an SVM classifier using the 20% of models which show the most similar or dissimilar well oil production rates (WOPR) relative to the true values (10% for each). Then, the other 80% of models are classified by the trained SVM, and we select the models on the side of low WOPR errors. One hundred channel reservoir models are initially generated by single normal equation simulation. By repeating the classification process, we can select models which have a geological trend similar to the true reservoir model. The average field of the selected models is utilized as a probability map for regeneration. Newly generated models can preserve correct channel features and exclude wrong geological properties while maintaining suitable uncertainty ranges. History matching with the initial models cannot provide trustworthy results; it fails to find the correct geological features of the true model. However, history matching with the regenerated ensemble offers reliable characterization results by identifying the proper channel trend. Furthermore, it gives dependable predictions of future performance with reduced uncertainties. We propose a novel classification scheme which integrates PCA, MDS, and SVM for regenerating reservoir models. The scheme can easily sort out reliable models which have a channel trend similar to the reference in the lowered-dimension space.
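A compact sketch of the PCA-MDS-SVM workflow described above, using synthetic stand-ins for the permeability fields and WOPR errors (array sizes, kernel, and component counts are illustrative assumptions, not the study's settings):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import MDS
from sklearn.svm import SVC

rng = np.random.default_rng(7)
# Stand-in for 100 channel-reservoir models, each flattened to a permeability vector
perm = rng.lognormal(mean=3.0, sigma=1.0, size=(100, 2500))
wopr_error = rng.uniform(0.0, 1.0, size=100)         # stand-in WOPR mismatch vs. the truth

# (1) PCA keeps the components with large eigenvalues
scores = PCA(n_components=10).fit_transform(np.log(perm))

# (2) MDS projects the PCA scores onto a 2D plane using Euclidean distances
xy = MDS(n_components=2, random_state=0).fit_transform(scores)

# (3) Train an SVM on the 10 most similar and 10 most dissimilar models...
order = np.argsort(wopr_error)
train_idx = np.concatenate([order[:10], order[-10:]])
labels = np.array([1] * 10 + [0] * 10)               # 1 = low WOPR error, 0 = high
clf = SVC(kernel="rbf").fit(xy[train_idx], labels)

# ...(4) classify the remaining 80 models and keep those on the low-error side
rest = np.setdiff1d(np.arange(100), train_idx)
selected = rest[clf.predict(xy[rest]) == 1]
print(f"{len(selected)} models selected for regeneration")
```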

Keywords: history matching, principal component analysis, reservoir modelling, support vector machine

Procedia PDF Downloads 160
18737 The Populist Rhetoric: The Symmetry of Environmentalism and Gandhianism in the Indian Mainstream Academia

Authors: Akanksha Indora

Abstract:

Environmental problems are considered a vital social issue in terms of the world's problems with pollution, environmental degradation, and resource depletion. Populism, in turn, is about appropriating social issues according to social conditions in order to mobilize a mass and construct a 'general will'. Populism encourages a move towards a common cause; it channels the emotions of the 'common people' towards the nation and nature. The Gandhian ideology has been received as a dominant ideology and the 'only' solution to environmental problems. This paper strives to understand the symmetry of environmentalism and Gandhianism, i.e., how the debate on the environment in India has been primarily studied through Gandhian ideology. The Indian social sciences visualize the broader issues of the environment from these perspectives, thus making it a hegemonic approach. Anti-pluralist rhetoric is a major element in the making of a populist. This paper focuses on how this hegemonic construction of Gandhian ideology in the debates on environmentalism has contributed to the making of anti-pluralistic rhetoric. This anti-pluralistic rhetoric has eliminated the possibility of a pluralistic perspective in the debates on the environment. The quest for a moral inspiration embedded in Gandhianism, whose situatedness is found in the Hindu social order, seems to have been completely rationalized through the larger politics of knowledge, thus making it appear as the only way forward when it is not.

Keywords: environmental populism, gandhianism, populist rhetoric, environmentalism

Procedia PDF Downloads 117
18736 HPPDFIM-HD: Transaction Distortion and Connected Perturbation Approach for Hierarchical Privacy Preserving Distributed Frequent Itemset Mining over Horizontally-Partitioned Dataset

Authors: Fuad Ali Mohammed Al-Yarimi

Abstract:

Many algorithms have been proposed to provide privacy preserving in data mining. These protocols are based on two main approaches: the perturbation approach and the cryptographic approach. The first is based on perturbation of the valuable information, while the second uses cryptographic techniques. The perturbation approach is much more efficient but has reduced accuracy, while the cryptographic approach can provide solutions with perfect accuracy. However, the cryptographic approach is a much slower method and requires considerable computation and communication overhead. In this paper, a new scalable protocol is proposed which combines the advantages of the perturbation and distortion approaches with the cryptographic approach to perform privacy-preserving distributed frequent itemset mining on horizontally distributed data. Both the privacy and performance characteristics of the proposed protocol are studied empirically.

Keywords: anonymity data, data mining, distributed frequent itemset mining, gaussian perturbation, perturbation approach, privacy preserving data mining

Procedia PDF Downloads 505
18735 Towards End-To-End Disease Prediction from Raw Metagenomic Data

Authors: Maxence Queyrel, Edi Prifti, Alexandre Templier, Jean-Daniel Zucker

Abstract:

Analysis of the human microbiome using metagenomic sequencing data has demonstrated a high ability to discriminate various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to data analysis. Such data contain millions of short sequence reads from fragmented DNA, stored as fastq files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use, time-consuming, and rely on a large number of parameters that often introduce variability and impact the estimation of the microbiome elements. Training Deep Neural Networks directly from raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most of these methods use the concept of word and sentence embeddings that create a meaningful and numerical representation of DNA sequences, while extracting features and reducing the dimensionality of the data. In this paper we present an end-to-end approach that classifies patients into disease groups directly from raw metagenomic reads: metagenome2vec. This approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which the sequence is most likely to come; and (iv) training a multiple instance learning classifier which predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to the influence of each genome on the prediction. Using two public real-life datasets as well as a simulated one, we demonstrate that this original approach reaches high performance, comparable with state-of-the-art methods applied directly on data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that with further dedication, the DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
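A toy sketch of steps (i)-(ii), k-mer tokenization followed by learning k-mer embeddings and averaging them into read embeddings; the k-mer size, embedding dimension, and the use of gensim's Word2Vec are illustrative assumptions, not the paper's exact pipeline:

```python
import numpy as np
from gensim.models import Word2Vec

def kmers(read: str, k: int = 6):
    """Split a DNA read into overlapping k-mers (step i: vocabulary generation)."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

# Stand-in reads; real input would come from fastq files
reads = ["ACGTACGTGGCTAGCTAGGAC", "TTGACCGTAGCTAGCTAAGGT", "ACGTGGCTTAGCAGGACCTTA"]
corpus = [kmers(r) for r in reads]

# Steps (i)-(ii): learn k-mer embeddings, then embed each read as the mean of its k-mers
w2v = Word2Vec(sentences=corpus, vector_size=32, window=5, min_count=1, epochs=50)
read_vectors = np.array([np.mean([w2v.wv[km] for km in doc], axis=0) for doc in corpus])
print(read_vectors.shape)   # (n_reads, 32), the input to steps (iii)-(iv)
```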

Keywords: deep learning, disease prediction, end-to-end machine learning, metagenomics, multiple instance learning, precision medicine

Procedia PDF Downloads 125
18734 Analytical Study of Applying the Account Aggregation Approach in E-Banking Services

Authors: A. Al Drees, A. Alahmari, R. Almuwayshir

Abstract:

Advanced information technology is becoming an important factor in the development of the financial services industry, especially the banking industry. It has introduced new ways of delivering banking to the customer, such as Internet banking. Banks began to look at electronic banking (e-banking) as a means to replace some of their traditional branch functions, using the Internet as a new distribution channel. Some consumers have more than one account, often across banks, and access these accounts using e-banking services. To see their current net worth position, customers have to log in to each of their accounts, get the details, and consolidate them. This not only takes ample time but is also a repetitive activity at a specified frequency. To address this point, an account aggregation concept is proposed as a solution. E-banking account aggregation, as one of the e-banking types, emerged to build a stronger relationship with customers. Account aggregation service generally refers to a service that allows customers to manage their bank accounts maintained in different institutions through a common Internet banking operating platform, with a high concern for security and privacy. This paper presents an overview of the e-banking account aggregation approach as a new service in the e-banking field.

Keywords: e-banking, account aggregation, security, enterprise development

Procedia PDF Downloads 327
18733 Capturing the Stress States in Video Conferences by Photoplethysmographic Pulse Detection

Authors: Jarek Krajewski, David Daxberger

Abstract:

We propose a stress detection method based on an RGB camera using heart rate detection, also known as Photoplethysmography Imaging (PPGI). This technique focuses on the measurement of the small changes in skin colour caused by blood perfusion. A stationary lab setting with simulated video conferences is chosen, using constant light conditions and a sampling rate of 30 fps. The ground truth measurement of heart rate is conducted with a common PPG system. The proposed pulse peak detection is based on a machine learning approach, applying brute-force feature extraction for the prediction of heart rate pulses. The statistical analysis showed good agreement (correlation r = .79, p < 0.05) between the reference heart rate system and the proposed method. Based on these findings, the proposed method could provide a reliable, low-cost, and contactless way of measuring HR parameters in daily-life environments.
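A simple signal-processing sketch of the PPGI idea (mean green-channel intensity over a skin region, band-pass filtered around typical heart-rate frequencies, then peak detection); the paper's actual ML-based peak detector and brute-force features are not reproduced here:

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

rng = np.random.default_rng(2)
fs = 30.0                                        # camera sampling rate, frames per second
t = np.arange(0, 20, 1 / fs)
# Stand-in for the mean green-channel value of a facial skin region per frame
green = 0.02 * np.sin(2 * np.pi * 1.2 * t) + 0.01 * rng.normal(size=t.size) + 0.5

# Band-pass 0.7-3.0 Hz (42-180 bpm), the usual heart-rate band
b, a = butter(3, [0.7 / (fs / 2), 3.0 / (fs / 2)], btype="band")
pulse = filtfilt(b, a, green)

peaks, _ = find_peaks(pulse, distance=fs * 0.4)  # enforce at most ~150 bpm
hr_bpm = 60.0 * fs / np.median(np.diff(peaks))
print(f"estimated heart rate: {hr_bpm:.1f} bpm")
```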

Keywords: heart rate, PPGI, machine learning, brute force feature extraction

Procedia PDF Downloads 123
18732 Exploring the Synergistic Effects of Aerobic Exercise and Cinnamon Extract on Metabolic Markers in Insulin-Resistant Rats through Advanced Machine Learning and Deep Learning Techniques

Authors: Masoomeh Alsadat Mirshafaei

Abstract:

The present study aims to explore the effect of an 8-week aerobic training regimen combined with cinnamon extract on serum irisin and leptin levels in insulin-resistant rats. Additionally, this research leverages various machine learning (ML) and deep learning (DL) algorithms to model the complex interdependencies between exercise, nutrition, and metabolic markers, offering a groundbreaking approach to obesity and diabetes research. Forty-eight Wistar rats were selected and randomly divided into four groups: control, training, cinnamon, and training-cinnamon. The training protocol was conducted over 8 weeks, with sessions 5 days a week at 75-80% VO2 max. The cinnamon and training-cinnamon groups were injected with 200 ml/kg/day of cinnamon extract. Data analysis included serum data, dietary intake, exercise intensity, and metabolic response variables, with blood samples collected 72 hours after the final training session. The dataset was analyzed using one-way ANOVA (P<0.05) and fed into various ML and DL models, including Support Vector Machines (SVM), Random Forest (RF), and Convolutional Neural Networks (CNN). Traditional statistical methods indicated that aerobic training, with and without cinnamon extract, significantly increased serum irisin and decreased leptin levels. Among the algorithms, the CNN model provided superior performance in identifying specific interactions between cinnamon extract concentration and exercise intensity, optimizing the increase in irisin and the decrease in leptin. The CNN model achieved an accuracy of 92%, outperforming the SVM (85%) and RF (88%) models in predicting the optimal conditions for metabolic marker improvements. The study demonstrated that advanced ML and DL techniques could uncover nuanced relationships and potential cellular responses to exercise and dietary supplements, which are not evident through traditional methods. These findings advocate for the integration of advanced analytical techniques in nutritional science and exercise physiology, paving the way for personalized health interventions in managing obesity and diabetes.

Keywords: aerobic training, cinnamon extract, insulin resistance, irisin, leptin, convolutional neural networks, exercise physiology, support vector machines, random forest

Procedia PDF Downloads 37
18731 Experiments on Weakly-Supervised Learning on Imperfect Data

Authors: Yan Cheng, Yijun Shao, James Rudolph, Charlene R. Weir, Beth Sahlmann, Qing Zeng-Treitler

Abstract:

Supervised predictive models require labeled data for training purposes. Complete and accurate labeled data, i.e., a 'gold standard', is not always available, and imperfectly labeled data may need to serve as an alternative. An important question is whether the accuracy of the labeled data creates a performance ceiling for the trained model. In this study, we trained several models to recognize the presence of delirium in clinical documents using data with annotations that are not completely accurate (i.e., weakly-supervised learning). In the external evaluation, the support vector machine model with a linear kernel performed best, achieving an area under the curve of 89.3% and accuracy of 88%, surpassing the 80% accuracy of the training sample. We then generated a set of simulated data and carried out a series of experiments which demonstrated that models trained on imperfect data can (but do not always) outperform the accuracy of the training data; e.g., the area under the curve for some models is higher than 80% when trained on data with an error rate of 40%. Our experiments also showed that the error resistance of linear modeling is associated with larger sample size, error type, and linearity of the data (all p-values < 0.001). In conclusion, this study sheds light on the usefulness of imperfect data in clinical research via weakly-supervised learning.
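A small sketch of the simulation idea (training on deliberately corrupted labels, then scoring against clean test labels); the dataset shape, noise rates, and the linear-kernel SVM below are illustrative assumptions rather than the study's actual clinical data:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
X, y = make_classification(n_samples=4000, n_features=20, random_state=3)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=3)

for error_rate in (0.0, 0.2, 0.4):
    noisy = y_tr.copy()
    flip = rng.random(noisy.size) < error_rate               # randomly corrupt training labels
    noisy[flip] = 1 - noisy[flip]
    clf = SVC(kernel="linear").fit(X_tr, noisy)
    auc = roc_auc_score(y_te, clf.decision_function(X_te))   # scored against clean labels
    print(f"label error rate {error_rate:.0%} -> test AUC {auc:.3f}")
```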

Keywords: weakly-supervised learning, support vector machine, prediction, delirium, simulation

Procedia PDF Downloads 198
18730 Computation of Natural Logarithm Using Abstract Chemical Reaction Networks

Authors: Iuliia Zarubiieva, Joyun Tseng, Vishwesh Kulkarni

Abstract:

Recent research has focused on nucleic acids as a substrate for designing biomolecular circuits for in situ monitoring and control. A common approach is to express such circuits by a set of idealised abstract chemical reaction networks (ACRNs). Here, we present new results on how abstract chemical reactions, viz., catalysis, annihilation, and degradation, can be used to implement a circuit that accurately computes the logarithm function using the Arithmetic-Geometric Mean (AGM) method, which has not previously been used in conjunction with ACRNs.
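For reference, the AGM method for the natural logarithm itself (independent of its chemical-reaction implementation) can be sketched in a few lines; the scaling constant below is an illustrative choice, and this is plain floating-point arithmetic rather than a reaction network:

```python
import math

def agm(a: float, b: float, tol: float = 1e-15) -> float:
    """Arithmetic-geometric mean of a and b."""
    while abs(a - b) > tol * a:
        a, b = (a + b) / 2.0, math.sqrt(a * b)
    return a

def ln_agm(x: float) -> float:
    """ln(x) via the classical identity ln(s) ~ pi / (2 * AGM(1, 4/s)) for large s.
    x is scaled by 2**m so that s = x * 2**m is large, then m*ln(2) is subtracted."""
    m = max(0, 30 - int(math.log2(x)))     # crude scaling so s is at least about 2**30
    s = x * 2.0 ** m
    return math.pi / (2.0 * agm(1.0, 4.0 / s)) - m * math.log(2)

print(ln_agm(10.0), math.log(10.0))        # the two values should agree closely
```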

Keywords: chemical reaction networks, ratio computation, stability, robustness

Procedia PDF Downloads 169
18729 Control Power in Doubly Fed Induction Generator Wind Turbine with SVM Control Inverter

Authors: Zerzouri Nora, Benalia Nadia, Bensiali Nadia

Abstract:

This paper presents a grid-connected wind power generation scheme using a Doubly Fed Induction Generator (DFIG). This system can supply power at constant voltage and constant frequency while the rotor speed varies, which makes it suitable for variable-speed wind energy applications. The DFIG system consists of a wind turbine, an asynchronous wound-rotor induction generator, and an inverter with a Space Vector Modulation (SVM) controller, in which the stator is connected directly to the grid and the rotor winding interfaces with the rotor-side and grid-side converters. The use of a back-to-back SVM converter in the rotor circuit results in low-distortion current, reactive power control, and operation at variable speed. Mathematical modeling of the DFIG is carried out in order to analyze the performance of the system, which is simulated using MATLAB. The simulation results show that the system can operate at variable speed with low harmonic current distortion. The objective is to track and extract maximum power from the wind energy system and transfer it to the grid for useful work.

Keywords: Doubly Fed Induction Generator, Wind Energy Conversion Systems, Space Vector Modulation, distortion harmonics

Procedia PDF Downloads 484
18728 Determinants of Aggregate Electricity Consumption in Ghana: A Multivariate Time Series Analysis

Authors: Renata Konadu

Abstract:

In Ghana, electricity has become the main form of energy on which all sectors of the economy rely for their businesses. Therefore, as the economy grows, the demand for and consumption of electricity also grow due to this heavy dependence. However, since the supply of electricity has not increased to match demand, there have been frequent power outages and load shedding affecting business performance. To solve this problem and advance policies to secure electricity in Ghana, it is imperative that the factors that cause consumption to increase be analysed by considering the three classes of consumers: residential, industrial, and non-residential. The main argument, however, is that the export of electricity to neighbouring countries should be included in the electricity consumption model and considered as one of the significant factors which can decrease or increase consumption. The author made use of multivariate time series data from 1980-2010 and econometric models such as Ordinary Least Squares (OLS) and the Vector Error Correction Model (VECM). Findings show that GDP growth, urban population growth, electricity exports, and industry value added to GDP were cointegrated. The results also showed that there is unidirectional causality from electricity exports, GDP growth, and industry value added to GDP to electricity consumption in the long run. However, in the short run, causality was found among all the variables and electricity consumption. The results have useful implications for energy policy makers, especially with regard to electricity consumption, demand, and supply.
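A minimal sketch of the VECM estimation step using statsmodels; the Ghanaian series are not reproduced here, so the variable names, lag order, and cointegration rank below are placeholders on synthetic data rather than the study's actual specification:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

rng = np.random.default_rng(5)
# Stand-in annual series (31 years, like 1980-2010): consumption, GDP growth, urban population, exports
data = pd.DataFrame(
    rng.normal(size=(31, 4)).cumsum(axis=0),
    columns=["elec_cons", "gdp_growth", "urban_pop", "elec_export"],
)

rank = select_coint_rank(data, det_order=0, k_ar_diff=1)      # Johansen trace test
model = VECM(data, k_ar_diff=1, coint_rank=max(rank.rank, 1), deterministic="ci")
res = model.fit()
print(res.summary())
```

Granger-causality checks on the fitted system would then support the short- and long-run causality statements reported above.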

Keywords: electricity consumption, energy policy, GDP growth, vector error correction model

Procedia PDF Downloads 437
18727 Functional Gene Expression in Human Cells Using Linear Vectors Derived from Bacteriophage N15 Processing

Authors: Kumaran Narayanan, Pei-Sheng Liew

Abstract:

This paper adapts the bacteriophage N15 protelomerase enzyme to assemble linear chromosomes as vectors for gene expression in human cells. Phage N15 has the unique ability to replicate as a linear plasmid with telomeres in E. coli during the prophage stage of its life cycle. The virus-encoded protelomerase enzyme cuts its circular genome and caps its ends to form hairpin telomeres, resulting in a linear, human-chromosome-like structure in E. coli. In mammalian cells, however, no enzyme with TelN-like activities has been found. In this work, we show for the first time the transfer of the protelomerase from phage into human and mouse cells and demonstrate recapitulation of its activity in these hosts. The function of this enzyme is assayed by demonstrating cleavage of its target DNA, followed by detecting telomere formation based on its resistance to recBCD enzyme digestion. We show that protelomerase expression persists for at least 60 days, which indicates limited silencing of its expression. Next, we show that an intact human β-globin gene delivered on this linear chromosome accurately retains its expression in the human cellular environment for at least 60 hours, demonstrating its stability and potential as a vector. These results demonstrate that the N15 protelomerase is able to function in mammalian cells to cut and heal DNA to create telomeres, which provides a new tool for creating novel structures by DNA resolution in these hosts.

Keywords: chromosome, beta-globin, DNA, gene expression, linear vector

Procedia PDF Downloads 192
18726 Optimization of Hate Speech and Abusive Language Detection on Indonesian-language Twitter using Genetic Algorithms

Authors: Rikson Gultom

Abstract:

Hate speech and abusive language on social media are difficult to detect; usually they are detected only after they become viral in cyberspace, by which time it is too late for prevention. An early detection system with fairly good accuracy is needed to reduce conflicts in society caused by postings on social media that attack individuals, groups, and governments in Indonesia. The purpose of this study is to find an early detection model for Twitter social media, using the machine learning method with the highest accuracy among those studied. In this study, the Support Vector Machine (SVM), Naïve Bayes (NB), and Random Forest Decision Tree (RFDT) methods were compared with the Support Vector Machine with genetic algorithm (SVM-GA), Naïve Bayes with genetic algorithm (NB-GA), and Random Forest Decision Tree with genetic algorithm (RFDT-GA). The study produced a comparison table for the accuracy of the hate speech and abusive language detection models, presented it in the form of a graph of the accuracy of the six algorithms developed on the Indonesian-language Twitter dataset, and identified the best model with the highest accuracy.

Keywords: abusive language, hate speech, machine learning, optimization, social media

Procedia PDF Downloads 128
18725 Aerobic Biodegradation of a Chlorinated Hydrocarbon by Bacillus Cereus 2479

Authors: Srijata Mitra, Mobina Parveen, Pranab Roy, Narayan Chandra Chattopadhyay

Abstract:

Chlorinated hydrocarbons can be a major pollution problem in groundwater as well as soil. Many people interact with these chemicals daily, either accidentally or professionally in the laboratory. One of the most common sources of chlorinated hydrocarbon contamination of soil and groundwater is industrial effluent. The wide use and discharge of trichloroethylene (TCE), a volatile chlorinated hydrocarbon from the chemical industry, has led to major water pollution in rural areas. TCE is mainly used as an industrial metal degreaser. Biotransformation of TCE to the potent carcinogen vinyl chloride (VC) by consortia of anaerobic bacteria may also play a role in this problem. For these reasons, the aim of the current study was to isolate and characterize the genes involved in TCE metabolism and also to carry out an in silico study of those genes. To our knowledge, only one aromatic dioxygenase system, the toluene dioxygenase in Pseudomonas putida F1, has been shown to be involved in TCE degradation. This is the first instance of a member of the Bacillus cereus group being used in the biodegradation of trichloroethylene. A novel bacterial strain, 2479, was isolated from an oil depot site at Rajbandh, Durgapur (West Bengal, India) by an enrichment culture technique. It was identified based on a polyphasic approach and ribotyping. The bacterium was gram-positive, rod-shaped, endospore-forming, and capable of degrading trichloroethylene as the sole carbon source. On the basis of phylogenetic data and fatty acid methyl ester analysis, strain 2479 should be placed within the genus Bacillus and the species cereus. However, the present isolate (strain 2479) is unique and sharply different from the usual Bacillus strains in its biodegrading nature. The Fujiwara test was used to show that strain 2479 could degrade TCE efficiently. The gene for TCE biodegradation was PCR-amplified from the genomic DNA of Bacillus cereus 2479 using todC1 gene-specific primers. The 600 bp amplicon was cloned into the expression vector pUC18 in the E. coli host XL1-Blue, expressed under the control of the lac promoter, and its nucleotide sequence was determined. The gene sequence was deposited at NCBI under accession no. GU183105. The in silico approach involved predicting the physico-chemical properties of the deduced Tce1 protein using the ProtParam tool. The tce1 gene contained a 342 bp ORF encoding 114 amino acids with a predicted molecular weight of 12.6 kDa; the theoretical pI value of the polypeptide was 5.17, molecular formula: C559H886N152O165S8, total number of atoms: 1770, aliphatic index: 101.93, instability index: 28.60, Grand Average of Hydropathicity (GRAVY): 0.152. Three differentially expressed proteins (97.1, 40, and 30 kDa) directly involved in TCE biodegradation were found to react immunologically with the antibodies raised against TCE-inducible proteins in Western blot analysis. The present study suggests that the cloned gene product (TCE1) is capable of degrading TCE, as verified chemically.

Keywords: cloning, Bacillus cereus, in silico analysis, TCE

Procedia PDF Downloads 397
18724 Hyper Tuned RBF SVM: Approach for the Prediction of Breast Cancer

Authors: Surita Maini, Sanjay Dhanka

Abstract:

Machine learning (ML) involves developing algorithms and statistical models that enable computers to learn and make predictions or decisions based on data without being explicitly programmed. Because of its broad capabilities, ML is gaining popularity in medical sectors: medical imaging, electronic health records, genomic data analysis, wearable devices, disease outbreak prediction, disease diagnosis, etc. In the last few decades, many researchers have tried to diagnose breast cancer (BC) using ML, because early detection of any disease can save millions of lives. Working in this direction, the authors propose a hybrid ML technique, RBF SVM, to predict BC at an earlier stage. The proposed method is implemented on the UCI ML Breast Cancer dataset with 569 instances and 32 attributes. The authors recorded the performance metrics of the proposed model, i.e., accuracy 98.24%, sensitivity 98.67%, specificity 97.43%, F1 score 98.67%, precision 98.67%, and run time 0.044769 seconds. The proposed method is validated by k-fold cross-validation.
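A minimal sketch of an RBF-kernel SVM with hyperparameter tuning on the UCI breast cancer data as shipped with scikit-learn (that copy exposes 30 numeric features, since the ID and diagnosis columns are handled separately); the grid values and fold count are illustrative, not the paper's settings:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, StratifiedKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)       # 569 instances, 30 numeric features

# Pipeline: scale features, then RBF-kernel SVM; grid values are illustrative only
pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
grid = {"svc__C": [0.1, 1, 10, 100], "svc__gamma": ["scale", 0.01, 0.001]}
search = GridSearchCV(pipe, grid, cv=StratifiedKFold(n_splits=5), scoring="accuracy")
search.fit(X, y)

print("best parameters:", search.best_params_)
print(f"cross-validated accuracy: {search.best_score_:.4f}")
```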

Keywords: breast cancer, support vector classifier, machine learning, hyperparameter tuning

Procedia PDF Downloads 67
18723 DNA Prime/MVTT Boost Enhances Broadly Protective Immune Response against Mosaic HIV-1 Gag

Authors: Wan Liu, Haibo Wang, Cathy Huang, Zhiwu Tan, Zhiwei Chen

Abstract:

The tremendous diversity of HIV-1 has been a major challenge for effective AIDS vaccine development. The mosaic approach presents the potential for vaccine design aiming for global protection. The mosaic antigen of HIV-1 Gag allows antigenic breadth for a vaccine-elicited immune response against a wider spectrum of viral strains. However, the enhancement of the immune response using vaccines is dependent on the strategy used. Heterologous prime/boost regimens have been shown to elicit high levels of immune response. Here, we investigated whether priming using plasmid DNA with electroporation, followed by boosting with the live replication-competent modified vaccinia virus vector TianTan (MVTT), combined with the mosaic antigenic sequence, could elicit a greater and broader antigen-specific response against HIV-1 Gag in mice. When compared to DNA or MVTT alone, or to the MVTT/MVTT group, the DNA/MVTT group resulted in high frequencies of broadly reactive, Gag-specific, polyfunctional, long-lived, and cytotoxic CD8+ T cells and an increased anti-Gag antibody titer. Meanwhile, the vaccination upregulated PD-1+ and Tim-3+ CD8+ T cells, myeloid-derived suppressor cells, and Treg cells to balance the stronger immune response induced. Importantly, the prime/boost vaccination helped control the EcoHIV and mesothelioma AB1-gag challenges. The stronger protective Gag-specific immunity induced by the mosaic DNA/MVTT vaccine corroborates the promise of the mosaic approach and the potential of the two acceptably safe vectors to enhance anti-HIV immunity and cancer prevention.

Keywords: DNA/MVTT vaccine, EcoHIV, mosaic antigen, mesothelioma AB1-gag

Procedia PDF Downloads 242