Search results for: search algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3773

2663 Adaptive Power Control of the City Bus Integrated Photovoltaic System

Authors: Piotr Kacejko, Mariusz Duk, Miroslaw Wendeker

Abstract:

This paper presents an adaptive controller to track the maximum power point of photovoltaic (PV) modules under fast irradiation changes on a city-bus roof. Photovoltaic systems have become a prominent option as an additional energy source for vehicles. The Municipal Transport Company (MPK) in Lublin has installed photovoltaic panels on the roofs of its buses. The solar panels turn solar energy into electric energy and are used to supply the buses' electrical equipment. This decreases the load on the buses' alternators, leading to lower fuel consumption and bringing both economic and ecological benefits. A DC-DC boost converter is selected as the power conditioning unit to coordinate the operating point of the system. In addition to the conversion efficiency of a photovoltaic panel, the maximum power point tracking (MPPT) method also plays a major role in harvesting the most energy from the sun. The MPPT unit on a moving vehicle must maintain high tracking accuracy in order to compensate for rapid irradiation changes caused by the dynamic motion of the vehicle. Maximum power point tracking controllers should be used to increase the efficiency and power output of solar panels under changing environmental factors. Several control algorithms for maximum power point tracking have been developed in the literature. However, the energy performance of MPPT algorithms has not been clarified for vehicle applications, where environmental factors change rapidly. In this study, an adaptive MPPT algorithm is examined under real ambient conditions. PV modules are mounted on a moving city bus designed to test solar systems on a moving vehicle. Some problems of a PV system associated with a moving vehicle are addressed. The proposed algorithm uses a scanning technique to determine the maximum power delivering capacity of the panel at a given operating condition and controls the PV panel accordingly. The aim of the control algorithm is to match the impedance of the PV modules by controlling the duty cycle of the internal switch, regardless of changes in the parameters of the controlled object and its outer environment. The presented algorithm was capable of reaching this aim. The structure of the adaptive controller was simplified on purpose: since such a simple controller, armed only with an ability to learn, achieves the aim of control, a more complex algorithm can only improve the result. The presented adaptive control system of the PV system is a general solution and can be used for other types of PV systems of both high and low power. Experimental results obtained from a comparison of algorithms in a motion loop are presented and discussed, covering fast changes in irradiation and partial shading conditions. The results obtained clearly show that the proposed method is simple to implement, with minimum tracking time and high tracking efficiency, demonstrating the superiority of the proposed method. This work has been financed by the Polish National Centre for Research and Development, PBS, under Grant Agreement No. PBS 2/A6/16/2013.
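
As an illustration of the scan-then-track idea described above, a minimal sketch in Python follows. It is an assumption-based example rather than the authors' controller; measure_pv is a hypothetical function returning module voltage and current for a given converter duty cycle.

# Illustrative sketch (not the authors' controller): a scan-then-perturb MPPT loop.
def scan_for_mpp(measure_pv, steps=50):
    """Sweep the duty cycle and return the value that maximises PV power."""
    best_duty, best_power = 0.0, -1.0
    for i in range(steps + 1):
        duty = i / steps
        v, c = measure_pv(duty)
        if v * c > best_power:
            best_duty, best_power = duty, v * c
    return best_duty

def track_mpp(measure_pv, duty, step=0.01):
    """Single perturb-and-observe update around the current operating point."""
    v0, c0 = measure_pv(duty)
    v1, c1 = measure_pv(duty + step)
    # Move in the direction that increases power; a full re-scan would be
    # triggered when irradiation changes rapidly (e.g. the bus enters shade).
    return duty + step if v1 * c1 > v0 * c0 else duty - step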

Keywords: adaptive control, photovoltaic energy, city bus electric load, DC-DC converter

Procedia PDF Downloads 212
2662 Sensor Network Routing Optimization by Simulating Eurygaster Life in Wheat Farms

Authors: Fariborz Ahmadi, Hamid Salehi, Khosrow Karimi

Abstract:

A sensor network is a set of sensor nodes that cooperate to perform predefined tasks. An important problem in such networks is power consumption. In this paper, an algorithm based on eurygaster life is introduced to minimize the power consumed by the nodes of these networks. In this method, the search space of the problem is divided into several partitions and each partition is investigated separately. The evaluation results show that our approach is more efficient than other evolutionary algorithms such as the genetic algorithm.
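
The abstract only gives the high-level idea of partitioning the search space; a minimal hedged sketch of that idea is shown below. The objective energy_cost, the variable bounds, and the sample counts are all illustrative assumptions, not the authors' algorithm.

import random

# Split each decision variable's range into slices, sample candidates in each
# partition separately, and keep the best solution found.
def partitioned_search(energy_cost, bounds, partitions=4, samples=20):
    best_x, best_cost = None, float("inf")
    for p in range(partitions):
        slices = [(lo + (hi - lo) * p / partitions,
                   lo + (hi - lo) * (p + 1) / partitions) for lo, hi in bounds]
        for _ in range(samples):
            x = [random.uniform(lo, hi) for lo, hi in slices]
            cost = energy_cost(x)
            if cost < best_cost:
                best_x, best_cost = x, cost
    return best_x, best_cost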

Keywords: evolutionary computation, genetic algorithm, particle swarm optimization, sensor network optimization

Procedia PDF Downloads 428
2661 Towards Competence-Based Regulatory Sciences Education in Sub-Saharan Africa: Identification of Competencies

Authors: Abigail Ekeigwe, Bethany McGowan, Loran C. Parker, Stephen Byrn, Kari L. Clase

Abstract:

There are growing calls in the literature to develop and implement competency-based regulatory sciences education (CBRSE) in sub-Saharan Africa to expand and create a pipeline of a competent workforce of regulatory scientists. A defined competence framework is an essential component in developing competency-based education. However, such a competence framework is not available for regulatory scientists in sub-Saharan Africa. The purpose of this research is to identify entry-level competencies for inclusion in a competency framework for regulatory scientists in sub-Saharan Africa as a first step in developing CBRSE. The team systematically reviewed the literature following the PRISMA guidelines for systematic reviews, based on a pre-registered protocol on the Open Science Framework (OSF). The protocol specifies the search strategy and the inclusion and exclusion criteria for publications. All included publications were coded to identify entry-level competencies for regulatory scientists. The team deductively coded the publications included in the study using the 'framework synthesis' model for systematic literature review. The World Health Organization’s conceptualization of competence guided the review and thematic synthesis. Topic and thematic coding were done using NVivo 12™ software. Based on the search strategy in the protocol, 2345 publications were retrieved. Twenty-two (n=22) of the retrieved publications met all the inclusion criteria for the research. Topic and thematic coding of the publications yielded three main domains of competence: knowledge, skills, and enabling behaviors. The knowledge domain has three sub-domains: administrative, regulatory governance/framework, and scientific knowledge. The skills domain has two sub-domains: functional and technical skills. Identification of competencies is the first step and serves as a bedrock for curriculum development and competency-based education. The competencies identified in this research will help policymakers, educators, institutions, and international development partners design and implement competence-based regulatory science education in sub-Saharan Africa, ultimately leading to access to safe, quality, and effective medical products.

Keywords: competence-based regulatory science education, competencies, systematic review, sub-Saharan Africa

Procedia PDF Downloads 197
2660 Study of Antibacterial Activity of Phenolic Compounds Extracted from Algerian Medicinal Plant

Authors: Khadri Sihem, Abbaci Nafissa, Zerari Labiba

Abstract:

In the context of the search for new bioactive natural products, we were interested in evaluating the antibacterial properties of two extracts of an Algerian medicinal plant: total phenols and flavonoids. Our study proceeds along two axes. The first concerns the extraction of phenolic compounds and flavonoids with methanol by liquid-liquid extraction, followed by quantification of the levels of these compounds and, finally, analysis of the chemical composition of the extracts. Along the second axis, we studied the antibacterial activity of the plant extracts.

Keywords: antibacterial activity, flavonoids, medicinal plants, polyphenols

Procedia PDF Downloads 554
2659 Preparation of Control Information and Analysis of a Gas Metering System Based on an Orifice Plate

Authors: A. Harrouz, A. Benatiallah, O. Harrouz

Abstract:

This paper presents a search for errors in the measuring instruments of a dynamic liquid or gas metering system, with reference to the tolerances defined by international standards and recommendations. We implement a program in MATLAB/Simulink whose calculations are based on ISO 5167. This program takes into consideration system parameters such as the orifice-plate configuration, the size of the orifice, the given design conditions and the reference conditions, and it finds the pressure drop for a given flow, or the flow for a given pressure loss. The results are considered very good and satisfactory because the errors identified in the measuring-instrument system are within the margin of error set by the regulations.
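
For reference, a minimal Python sketch of the ISO 5167 orifice mass-flow relation that such a program evaluates is given below. The discharge coefficient C and expansibility factor eps are passed in as simple constants here; the full standard computes them from the Reynolds number, the diameter ratio and the tapping arrangement, so the values used are illustrative assumptions.

import math

def orifice_mass_flow(d, D, dp, rho, C=0.61, eps=1.0):
    """d, D in m; dp in Pa; rho in kg/m^3; returns mass flow in kg/s (ISO 5167 form)."""
    beta = d / D
    return (C / math.sqrt(1.0 - beta**4)) * eps * (math.pi / 4.0) * d**2 \
        * math.sqrt(2.0 * dp * rho)

# Example: 100 mm orifice in a 200 mm line, 25 kPa differential, gas at 50 kg/m^3.
print(orifice_mass_flow(d=0.10, D=0.20, dp=25e3, rho=50.0))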

Keywords: analyzing, control, gas, meters system

Procedia PDF Downloads 399
2658 Unravelling the Knot: Towards a Definition of ‘Digital Labor’

Authors: Marta D'Onofrio

Abstract:

The debate on the digitalization of the economy has raised questions about how both labor and the regulation of work processes are changing due to the introduction of digital technologies into the productive system. Within the literature, the term ‘digital labor’ is commonly used to identify the impact of digitalization on labor. Despite the wide use of this term, an unambiguous definition of it is still not available, and this creates confusion in the use of terminology and in attempts at classification. Consequently, the purpose of this paper is to provide a definition and to propose a classification of ‘digital labor’, resorting to the theoretical approach of organizational studies.

Keywords: digital labor, digitalization, data-driven algorithms, big data, organizational studies

Procedia PDF Downloads 153
2657 Using Mind Mapping and Morphological Analysis within a New Methodology for Teaching Students of Products’ Design

Authors: Kareem Saber

Abstract:

Many product-design instructors search for ways to help students develop their designs simply by reducing design stages and deriving simple design-process forms to achieve design creativity. The researcher therefore derived a new design-process form called “hierarchical design”, which reduces the design process to three stages, and tried that methodology on about two hundred students. The trial led to very good results, as students could develop designs characterized by creativity and innovation, which demonstrates the success and effectiveness of the proposed methodology.

Keywords: mind mapping, morphological analysis, product design, design process

Procedia PDF Downloads 178
2656 Krill-Herd Step-Up Approach Based Energy Efficiency Enhancement Opportunities in the Offshore Mixed Refrigerant Natural Gas Liquefaction Process

Authors: Kinza Qadeer, Muhammad Abdul Qyyum, Moonyong Lee

Abstract:

Natural gas has become an attractive energy source in comparison with other fossil fuels because of its lower CO₂ and other air pollutant emissions. Therefore, compared to the demand for coal and oil, that for natural gas is increasing rapidly worldwide. The transportation of natural gas over long distances as a liquid (LNG) is preferable for several reasons, including economic, technical, political, and safety factors. However, LNG production is an energy-intensive process due to the tremendous amount of power required to compress the refrigerants, which provide sufficient cold energy to liquefy natural gas. Therefore, one of the major issues in the LNG industry is to improve the energy efficiency of existing LNG processes through a cost-effective approach, that is, optimization. In this context, a bio-inspired Krill-herd (KH) step-up approach was examined to enhance the energy efficiency of a single mixed refrigerant (SMR) natural gas liquefaction (LNG) process, which is considered one of the most promising candidates for offshore LNG production (FPSO). The optimal design of natural gas liquefaction processes involves multivariable non-linear thermodynamic interactions, which lead to exergy destruction and contribute to process irreversibility. As key decision variables, the optimal values of mixed refrigerant flow rates and process operating pressures were determined based on the herding behavior of krill individuals corresponding to the minimum energy consumption for LNG production. To perform the rigorous process analysis, the SMR process was simulated in Aspen Hysys® software and the resulting model was connected with the Krill-herd approach coded in MATLAB. The optimal operating conditions found by the proposed approach significantly reduced the overall energy consumption of the SMR process by up to 22.5% and also improved the coefficient of performance in comparison with the base case. The proposed approach was also compared with other well-proven optimization algorithms, such as genetic and particle swarm optimization algorithms, and was found to exhibit a superior performance over these existing approaches.
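
A highly simplified Python sketch of a krill-herd style update over continuous decision variables (refrigerant flow rates and pressures) is shown below. It is an assumption-based illustration of the metaheuristic only: energy(x) is a hypothetical stand-in for the Aspen Hysys simulation returning specific compression power, and the motion terms are reduced to attraction toward the best individual plus random diffusion.

import numpy as np

def krill_herd(energy, bounds, n_krill=20, iters=100, dt=0.5, diffusion=0.01):
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    X = lo + np.random.rand(n_krill, len(bounds)) * (hi - lo)
    for _ in range(iters):
        fitness = np.array([energy(x) for x in X])
        best = X[np.argmin(fitness)]
        # Motion induced towards the best krill plus random physical diffusion.
        X = X + dt * (best - X) * np.random.rand(n_krill, 1) \
            + diffusion * (hi - lo) * (np.random.rand(n_krill, len(bounds)) - 0.5)
        X = np.clip(X, lo, hi)
    fitness = np.array([energy(x) for x in X])
    return X[np.argmin(fitness)], fitness.min()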

Keywords: energy efficiency, Krill-herd, LNG, optimization, single mixed refrigerant

Procedia PDF Downloads 155
2655 The Perspective of Waste Frying Oil in São Paulo and Its Dimensions in the Reverse Logistics of the Production of Biodiesel

Authors: Max Filipe Goncalves, Alessandra Concilio, Rodrigo Shimada

Abstract:

Waste frying oil is highly polluting when disposed of incorrectly in the environment. It is necessary to study reverse logistics to identify how it can be structured to return this kind of waste to the productive chain for use in new processes. In this context, the objective of this paper is to analyze the perspective of waste frying oil in São Paulo and its dimensions in the production of biodiesel. Underlying factors such as the agents, motivators, and legal aspects were analyzed to demonstrate this. The SWOT matrix was then built from the aspects observed, covering the strengths, weaknesses, opportunities, and threats of the reverse logistics chain in São Paulo.

Keywords: biodiesel, perspective, reverse logistic, WFO

Procedia PDF Downloads 209
2654 Adaptive Process Monitoring for Time-Varying Situations Using Statistical Learning Algorithms

Authors: Seulki Lee, Seoung Bum Kim

Abstract:

Statistical process control (SPC) is a practical and effective method for quality control. The most important and widely used technique in SPC is the control chart. The main goal of a control chart is to detect any assignable changes that affect the quality output. Most conventional control charts, such as Hotelling’s T2 chart, are based on the assumption that the quality characteristics follow a multivariate normal distribution. However, in modern complicated manufacturing systems, appropriate control chart techniques that can efficiently handle nonnormal processes are required. To overcome the shortcomings of conventional control charts for nonnormal processes, several methods have been proposed that combine statistical learning algorithms and multivariate control charts. Statistical learning-based control charts, such as support vector data description (SVDD)-based charts and k-nearest neighbors-based charts, have proven their improved performance in nonnormal situations compared to that of the T2 chart. Besides nonnormality, time-varying operations are also quite common in real manufacturing fields because of various factors such as product and set-point changes, seasonal variations, catalyst degradation, and sensor drifting. However, traditional control charts cannot accommodate future condition changes of the process because they are formulated based on the data information recorded in the early stage of the process. In the present paper, we propose an SVDD-based control chart that is capable of adaptively monitoring time-varying and nonnormal processes. We reformulated the SVDD algorithm into a time-adaptive SVDD algorithm by adding a weighting factor that reflects time-varying situations. Moreover, we defined the updating region for the efficient model-updating structure of the control chart. The proposed control chart simultaneously allows efficient model updates and timely detection of out-of-control signals. The effectiveness and applicability of the proposed chart were demonstrated through experiments with simulated data and real data from the metal frame process in mobile device manufacturing.
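
A minimal Python sketch of the time-weighting idea is given below, using scikit-learn's OneClassSVM (with an RBF kernel it is closely related to SVDD). The exponential decay of the sample weights, the nu value and the out-of-control rule are illustrative assumptions, not the paper's exact formulation.

import numpy as np
from sklearn.svm import OneClassSVM

def fit_time_adaptive_boundary(X_history, decay=0.02, nu=0.05, gamma="scale"):
    n = len(X_history)
    # Newer observations get larger weights so the boundary tracks process drift.
    weights = np.exp(-decay * np.arange(n)[::-1])
    model = OneClassSVM(kernel="rbf", nu=nu, gamma=gamma)
    model.fit(X_history, sample_weight=weights)
    return model

def out_of_control(model, x_new):
    # A negative decision value means the new point lies outside the learned boundary.
    return model.decision_function(x_new.reshape(1, -1))[0] < 0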

Keywords: multivariate control chart, nonparametric method, support vector data description, time-varying process

Procedia PDF Downloads 299
2653 A Comparative Study of Optimization Techniques and Models to Forecasting Dengue Fever

Authors: Sudha T., Naveen C.

Abstract:

Dengue is a serious public health issue that causes significant annual economic and welfare burdens on nations. However, enhanced optimization techniques and quantitative modeling approaches can predict the incidence of dengue. By advocating for a data-driven approach, public health officials can make informed decisions, thereby improving the overall effectiveness of sudden disease outbreak control efforts. The National Oceanic and Atmospheric Administration and the Centers for Disease Control and Prevention are two of the U.S. Federal Government agencies from which this study uses environmental data. Based on environmental data that describe changes in temperature, precipitation, vegetation, and other factors known to affect dengue incidence, several predictive models are constructed that use different machine learning methods to estimate weekly dengue cases. The first step involves preparing the data, which includes handling outliers and missing values to make sure the data is ready for subsequent processing and the creation of an accurate forecasting model. In the second phase, multiple feature selection procedures are applied using various machine learning models and optimization techniques. In the third phase of the research, machine learning models such as the Huber Regressor, Support Vector Machine, Gradient Boosting Regressor (GBR), and Support Vector Regressor (SVR) are compared with several optimization techniques for feature selection, such as Harmony Search and the Genetic Algorithm. In the fourth stage, the model's performance is evaluated using Mean Square Error (MSE), Mean Absolute Error (MAE), and Root Mean Square Error (RMSE). The goal is to select an optimization strategy with the fewest errors, lowest cost, highest productivity, or maximum potential results. Optimization is widely employed in a variety of industries, including engineering, science, management, mathematics, finance, and medicine. An effective optimization method based on harmony search and an integrated genetic algorithm is introduced for input feature selection, and it shows an important improvement in the model's predictive accuracy. The predictive models built on the Huber Regressor perform best for both optimization and prediction.
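
A hedged Python sketch of genetic-algorithm feature selection wrapped around a Huber Regressor, illustrating the third phase described above, follows. The population size, mutation rate, cross-validation scheme and scoring are illustrative choices, not the study's settings.

import numpy as np
from sklearn.linear_model import HuberRegressor
from sklearn.model_selection import cross_val_score

def ga_feature_selection(X, y, pop_size=20, generations=30, mut_rate=0.05, rng=None):
    rng = rng or np.random.default_rng(0)
    n_features = X.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n_features))

    def fitness(mask):
        if mask.sum() == 0:
            return -np.inf
        scores = cross_val_score(HuberRegressor(), X[:, mask.astype(bool)], y,
                                 scoring="neg_mean_squared_error", cv=3)
        return scores.mean()

    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)][-pop_size // 2:]   # keep the best half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_features)
            child = np.concatenate([a[:cut], b[cut:]])        # one-point crossover
            flip = rng.random(n_features) < mut_rate          # bit-flip mutation
            children.append(np.where(flip, 1 - child, child))
        pop = np.vstack([parents, children])
    best = pop[np.argmax([fitness(ind) for ind in pop])]
    return best.astype(bool)   # boolean mask of selected environmental features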

Keywords: deep learning model, dengue fever, prediction, optimization

Procedia PDF Downloads 65
2652 Pregnancy and Birth Outcomes of Single versus Multiple Embryo Transfer in Gestational Surrogacy Arrangements: A Systematic Review

Authors: Jutharat Attawet, Alex Y. Wang, Cindy M. Farquhar, Elizabeth A. Sullivan

Abstract:

Background: Adverse maternal and perinatal outcomes of multiple pregnancies resulting from multiple embryo transfers (ET) have become significant concerns. This is particularly relevant for gestational carriers since they usually do not have infertility issues. Single embryo transfer (SET) has therefore been encouraged in assisted reproductive technology (ART) practice in order to reduce multiple pregnancies. Objectives: This systematic review aims to investigate the pregnancy and birth outcomes of SET and multiple ET in surrogacy arrangements. Search methods: This study is a systematic review. Electronic databases including CINAHL, Medline, Embase, Scopus and ProQuest were searched for studies from 1980 to 2017. Cross-references and national ART reports were also searched manually. Articles were accessed without restriction on language or study type. Carrier cycles involving SET and multiple ET were identified in the database search. The main outcome measures, including clinical pregnancy, live delivery and multiple deliveries per gestational carrier cycle, were compared between SET and multiple ET. Mantel-Haenszel risk ratios (RRs) with 95% confidence intervals (CIs) were calculated using RevMan 5.3, based on the numbers of outcome events in the SET and multiple ET arms of each study. Outcomes: The search returned 97 articles, of which 5 met the inclusion criteria. Approximately 50% of carrier cycles were transferred a single embryo and 50% were transferred more than one embryo. The clinical pregnancy rate (CPR) was 39% for SET and 53% for multiple ET, which was not significantly different, with RR = 0.83 (95% CI: 0.67-1.03). The live delivery rate was 33% for SET and 57% for multiple ET, which was not significantly different, with RR = 0.78 (95% CI: 0.61-1.00). The risk of multiple delivery per carrier was greater in the multiple ET carrier cycles (RR = 0.4, 95% CI: 0.01-0.26). There were 104 sets of twins (including one set of twins selectively reduced from triplets to twins) and 1 set of triplets in the multiple ET carrier cycles. In the SET carrier cycles, there were 2 sets of twins. Significance of the study: SET should be advocated among surrogate carriers to prevent multiple pregnancies and subsequent adverse outcomes for both carrier and baby. Surrogacy practice should be reviewed, and surrogate carriers should be fully informed of the risk of adverse maternal and birth outcomes of multiple pregnancies due to multiple embryo transfers.
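
For illustration, a minimal Python sketch of a risk ratio with a 95% confidence interval for a single two-by-two table is given below; the review itself pooled several studies with the Mantel-Haenszel method in RevMan 5.3, and the example counts are hypothetical.

import math

def risk_ratio(events_a, total_a, events_b, total_b):
    rr = (events_a / total_a) / (events_b / total_b)
    # Standard error of log(RR) for a two-by-two table.
    se = math.sqrt(1 / events_a - 1 / total_a + 1 / events_b - 1 / total_b)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, (lo, hi)

# Hypothetical example: 39 clinical pregnancies per 100 SET cycles
# versus 53 per 100 multiple ET cycles.
print(risk_ratio(39, 100, 53, 100))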

Keywords: assisted reproduction, birth outcomes, carrier, gestational surrogacy, multiple embryo transfer, multiple pregnancy, pregnancy outcomes, single embryo transfer, surrogate mother, systematic review

Procedia PDF Downloads 404
2651 Improved Computational Efficiency of Machine Learning Algorithm Based on Evaluation Metrics to Control the Spread of Coronavirus in the UK

Authors: Swathi Ganesan, Nalinda Somasiri, Rebecca Jeyavadhanam, Gayathri Karthick

Abstract:

The COVID-19 crisis presents a substantial and critical hazard to worldwide health. Since the occurrence of the disease in the UK in late January 2020, the number of people confirmed to have acquired the illness has increased tremendously across the country, and the number of individuals affected is undoubtedly considerably high. The purpose of this research is to develop a predictive machine learning model that could forecast COVID-19 cases within the UK. This study concentrates on the statistical data collected from 31st January 2020 to 31st March 2021 in the United Kingdom. Information on total COVID cases registered, new cases encountered on a daily basis, total deaths registered, and patients’ deaths per day due to Coronavirus is collected from the World Health Organisation (WHO). Data preprocessing is carried out to identify any missing values, outliers, or anomalies in the dataset. The data is split in an 8:2 ratio for training and testing purposes to forecast future new COVID cases. Support Vector Machines (SVM), Random Forests, and linear regression algorithms are chosen to study the model performance in the prediction of new COVID-19 cases. From evaluation metrics such as the r-squared value and mean squared error, the statistical performance of the model in predicting new COVID cases is evaluated. Random Forest outperformed the other two machine learning algorithms with a training accuracy of 99.47% and a testing accuracy of 98.26% when n=30. The mean square error obtained for Random Forest is 4.05e11, which is lower than that of the other predictive models used in this study. From the experimental analysis, the Random Forest algorithm performs more effectively and efficiently in predicting new COVID cases, which could help the health sector take relevant control measures against the spread of the virus.
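
A short Python sketch of the forecasting set-up described above follows: an 8:2 chronological split and a Random Forest scored with r-squared and mean squared error. Reading the abstract's "n=30" as the number of trees is an assumption, and X (lagged case counts and related features) and y (new daily cases) are placeholders.

from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

def fit_covid_forecaster(X, y):
    # shuffle=False keeps the chronological order of the time series.
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, shuffle=False)
    model = RandomForestRegressor(n_estimators=30, random_state=42)
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    return model, r2_score(y_te, pred), mean_squared_error(y_te, pred)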

Keywords: COVID-19, machine learning, supervised learning, unsupervised learning, linear regression, support vector machine, random forest

Procedia PDF Downloads 121
2650 Development of Peptide Inhibitors against Dengue Virus Infection by in Silico Design

Authors: Aussara Panya, Nunghathai Sawasdee, Mutita Junking, Chatchawan Srisawat, Kiattawee Choowongkomon, Pa-Thai Yenchitsomanus

Abstract:

Dengue virus (DENV) infection is a global public health problem with approximately 100 million infected cases a year. Presently, there is no approved vaccine or effective drug available; therefore, the development of anti-DENV drugs is urgently needed. Clinical reports have previously revealed a positive association between disease severity and viral titer, suggesting that anti-DENV drug therapy could possibly ameliorate disease severity. Although several anti-DENV agents have shown inhibitory activities against DENV infection, to date none of them has reached clinical use in patients. The surface envelope (E) protein of DENV is critical for the viral entry step, which includes attachment and membrane fusion; thus, blocking the envelope protein is an attractive strategy for anti-DENV drug development. To search for safe anti-DENV agents, this study aimed to identify novel peptide inhibitors to counter DENV infection through the targeting of the E protein using a structure-based in silico design. Two strategies were used: first, to identify peptide inhibitors that interfere with the membrane fusion process, with the hydrophobic pocket on the E protein as the target; and second, to destabilize the virion structure organization through disruption of the interaction between the envelope and membrane proteins. The molecular docking technique was used in the first strategy to search for peptide inhibitors that specifically bind to the hydrophobic pocket. In the second strategy, the peptide inhibitor was designed to mimic the ectodomain portion of the membrane protein to disrupt the protein-protein interaction. The designed peptides were tested for their effects on cell viability, to measure their toxicity to the cells, and assayed for their ability to inhibit DENV infection in Vero cells. Furthermore, their antiviral effects on viral replication, intracellular protein level, and viral production were observed using qPCR, cell-based flavivirus immunodetection, and immunofluorescence assay. None of the tested peptides showed a significant effect on cell viability. The small peptide inhibitor obtained from molecular docking, Glu-Phe (EF), effectively inhibited DENV infection in a cell culture system. Its most potent effect was observed for DENV2, with a half maximal inhibitory concentration (IC50) of 96 μM, but it partially inhibited other serotypes. Treatment of infected cells with EF at 200 µM also significantly reduced the viral genome and protein to 83.47% and 84.15%, respectively, corresponding to the reduction of infected cell numbers. An additional approach was carried out using a peptide mimicking the membrane (M) protein, namely MLH40. Treatment with MLH40 caused a reduction of foci formation in the four individual DENV serotypes (DENV1-4) with IC50 of 24-31 μM. Further characterization suggested that MLH40 specifically blocked viral attachment to the host membrane, and treatment with 100 μM could diminish 80% of viral attachment. In summary, targeting the hydrophobic pocket and M-binding site on the E protein using peptide inhibitors could inhibit DENV infection. The results provide proof-of-concept for the development of antiviral therapeutic peptide inhibitors to counter DENV infection through a structure-based design targeting conserved viral proteins.

Keywords: dengue virus, dengue virus infection, drug design, peptide inhibitor

Procedia PDF Downloads 357
2649 Automatic Intelligent Analysis of Malware Behaviour

Authors: Hermann Dornhackl, Konstantin Kadletz, Robert Luh, Paul Tavolato

Abstract:

In this paper we describe the use of formal methods to model malware behaviour. The modelling of harmful behaviour rests upon syntactic structures that represent malicious procedures inside malware. The malicious activities are modelled by a formal grammar, where the components of API calls are the terminals and the sets of API calls used in combination to achieve a goal are designated as non-terminals. The combination of different non-terminals in various ways and tiers makes up the attack vectors that are used by harmful software. Based on these syntactic structures, a parser can be generated which takes execution traces as input for pattern recognition.
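
A minimal Python sketch of the grammar idea follows: non-terminals name malicious goals and expand into ordered sequences of API-call terminals, and an execution trace matches a non-terminal when its terminals occur in order. The grammar, the call names and the trace are illustrative, not the authors' rule set.

# Toy grammar: non-terminal -> ordered sequence of API-call terminals.
GRAMMAR = {
    "DropAndExecute": ["CreateFile", "WriteFile", "CreateProcess"],
    "SelfInstall":    ["RegOpenKey", "RegSetValue", "RegCloseKey"],
}

def matches(trace, production):
    """True if the production's terminals occur in the trace in order (possibly interleaved)."""
    it = iter(trace)
    return all(any(call == terminal for call in it) for terminal in production)

def classify(trace):
    return [nt for nt, production in GRAMMAR.items() if matches(trace, production)]

trace = ["NtQueryInformation", "CreateFile", "WriteFile", "Sleep", "CreateProcess"]
print(classify(trace))   # ['DropAndExecute']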

Keywords: malware behaviour, modelling, parsing, search, pattern matching

Procedia PDF Downloads 332
2648 Facilitating Primary Care Practitioners to Improve Outcomes for People With Oropharyngeal Dysphagia Living in the Community: An Ongoing Realist Review

Authors: Caroline Smith, Professor Debi Bhattacharya, Sion Scott

Abstract:

Introduction: Oropharyngeal Dysphagia (OD) affects around 15% of older people; however, it is often unrecognised and underdiagnosed until they are hospitalised. There is a need for primary care healthcare practitioners (HCPs) to assume a proactive role in identifying and managing OD to prevent adverse outcomes such as aspiration pneumonia. Understanding the determinants of primary care HCPs undertaking this new behaviour provides the intervention targets to address. This realist review, underpinned by the Theoretical Domains Framework (TDF), aims to synthesise relevant literature and develop programme theories to understand what interventions work, how they work, and under what circumstances to facilitate HCPs to prevent harm from OD. Combining realist methodology with behavioural science will permit conceptualisation of intervention components as theoretical behavioural constructs, thus informing the design of a future behaviour change intervention. Furthermore, through the TDF’s linkage to a taxonomy of behaviour change techniques, we will identify corresponding behaviour change techniques to include in this intervention. Methods & analysis: We are following the five steps for undertaking a realist review: 1) clarify the scope, 2) search the literature, 3) appraise and extract data, 4) synthesise evidence, and 5) evaluate. We have searched the Medline, Google Scholar, PubMed, EMBASE, CINAHL, AMED, Scopus and PsycINFO databases. We are obtaining additional evidence through grey literature, snowball sampling, lateral searching and consulting the stakeholder group. Literature is being screened, evaluated and synthesised in Excel and NVivo. We will appraise evidence in relation to its relevance and rigour. Data will be extracted and synthesised according to its relation to initial programme theories (IPTs). IPTs were constructed after the preliminary literature search, informed by the TDF and with input from a stakeholder group of patient and public involvement advisors, general practitioners, speech and language therapists, geriatricians and pharmacists. We will follow the Realist and Meta-narrative Evidence Syntheses: Evolving Standards (RAMESES) quality and publication standards to report study results. Results: In this ongoing review, our search has identified 1417 manuscripts, with approximately 20% progressing to full text screening. We inductively generated 10 IPTs that hypothesise that practitioners require: the knowledge to spot the signs and symptoms of OD; the skills to provide initial advice and support; and access to resources in their working environment to support them in conducting these new behaviours. We mapped the 10 IPTs to 8 TDF domains and then generated a further 12 IPTs deductively, using domain definitions to fulfil the remaining 6 TDF domains. Deductively generated IPTs broadened our thinking to consider domains such as ‘Emotion,’ ‘Optimism’ and ‘Social Influence’, e.g., if practitioners perceive that patients, carers and relatives expect initial advice and support, then they will be more likely to provide this, because they will feel obligated to do so. After prioritisation with stakeholders using a modified nominal group technique approach, a maximum of 10 IPTs will progress to testing against the literature.

Keywords: behaviour change, deglutition disorders, primary healthcare, realist review

Procedia PDF Downloads 85
2647 Unleashing the Power of Cerebrospinal System for a Better Computer Architecture

Authors: Lakshmi N. Reddi, Akanksha Varma Sagi

Abstract:

Studies on biomimetics are largely developed, deriving inspiration from natural processes in our objective world to develop novel technologies. Recent studies are diverse in nature, making their categorization quite challenging. Based on an exhaustive survey, we developed categorizations based on either the essential elements of nature - air, water, land, fire, and space - or on form/shape, functionality, and process. Such diverse studies as aircraft wings inspired by bird wings, a self-cleaning coating inspired by a lotus petal, wetsuits inspired by beaver fur, and search algorithms inspired by arboreal ant path networks lend themselves to these categorizations. Our categorizations of biomimetic studies allowed us to define a different dimension of biomimetics. This new dimension is not restricted to inspiration from the objective world. It is based on the premise that the biological processes observed in the objective world find their reflections in our human bodies in a variety of ways. For example, the lungs provide the most efficient example of liquid-gas phase exchange, the heart exemplifies a very efficient pumping and circulatory system, and the kidneys epitomize the most effective cleaning system. The main focus of this paper is to bring out the magnificence of the cerebro-spinal system (CSS) insofar as it relates to our current computer architecture. In particular, the paper uses four key measures to analyze the differences between the CSS and human-engineered computational systems. These are adaptability, sustainability, energy efficiency, and resilience. We found that the cerebrospinal system reveals some important challenges in the development and evolution of our current computer architectures. In particular, the myriad ways in which the CSS is integrated with other systems/processes (circulatory, respiration, etc.) offer useful insights on how human-engineered computational systems could be made more sustainable, energy-efficient, resilient, and adaptable. In our paper, we highlight the energy consumption differences between the CSS and our current computational designs. Apart from the obvious differences in materials used between the two, the systemic nature of how the CSS functions provides clues to enhance the life-cycles of our current computational systems. The rapid formation and changes in the physiology of dendritic spines and their synaptic plasticity causing memory changes (e.g., long-term potentiation and long-term depression) allowed us to formulate differences in the adaptability and resilience of the CSS. In addition, the CSS is sustained by integrative functions of various organs, and its robustness comes from its interdependence with the circulatory system. The paper documents and analyzes quantifiable differences between the two in terms of the four measures. Our analyses point out the possibilities in the development of computational systems that are more adaptable, sustainable, energy efficient, and resilient. It concludes with potential approaches for technological advancement through the creation of more interconnected and interdependent systems to replicate the effective operation of the cerebro-spinal system.

Keywords: cerebrospinal system, computer architecture, adaptability, sustainability, resilience, energy efficiency

Procedia PDF Downloads 99
2646 Detecting Covid-19 Fake News Using Deep Learning Technique

Authors: Anjali A. Prasad

Abstract:

Nowadays, social media plays an important role in spreading misinformation or fake news. This study analyzes the fake news related to the COVID-19 pandemic spread on social media. This paper aims at evaluating and comparing different approaches that are used to mitigate this issue, including popular deep learning approaches such as CNN, RNN, LSTM, and BERT, for classification. To evaluate the models’ performance, we used accuracy, precision, recall, and F1-score as the evaluation metrics. Finally, we compare which algorithm shows the best result among the four.
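
A short Python sketch of the evaluation step is given below: given the test labels predicted by each classifier (CNN, RNN, LSTM, BERT), score them with the four metrics named above. The predictions dictionary and variable names are placeholders.

from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

def compare_models(y_true, predictions):
    """predictions: dict mapping model name -> predicted labels on the test set."""
    rows = {}
    for name, y_pred in predictions.items():
        rows[name] = {
            "accuracy":  accuracy_score(y_true, y_pred),
            "precision": precision_score(y_true, y_pred),
            "recall":    recall_score(y_true, y_pred),
            "f1":        f1_score(y_true, y_pred),
        }
    return rows

# Example usage (hypothetical arrays):
# compare_models(y_test, {"CNN": cnn_pred, "RNN": rnn_pred, "LSTM": lstm_pred, "BERT": bert_pred})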

Keywords: BERT, CNN, LSTM, RNN

Procedia PDF Downloads 206
2645 An Architectural Approach for the Dynamic Adaptation of Services-Based Software

Authors: Mohhamed Yassine Baroudi, Abdelkrim Benammar, Fethi Tarik Bendimerad

Abstract:

This paper proposes a software architecture for dynamic service adaptation. The services are constituted by reusable software components. The goal of adaptation is to optimize the service as a function of its execution context. As a first step, the context takes into account only the user needs, but other elements will be added. A particular feature of our proposition is the profiles, which are used not only to describe the context’s elements but also the components themselves. An adapter analyzes the compatibility between all these profiles and detects the points where the profiles are not compatible. The same adapter then searches for and applies the possible adaptation solutions: component customization, insertion, extraction or replacement.
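
A minimal Python sketch of the profile-matching idea follows: context and component profiles are plain dictionaries, and the adapter reports the properties on which they disagree before choosing an adaptation action. The property names are illustrative assumptions.

def incompatibilities(context_profile, component_profile):
    """Return the properties where the component's profile conflicts with the context."""
    return {key: (context_profile[key], component_profile.get(key))
            for key in context_profile
            if component_profile.get(key) != context_profile[key]}

context = {"screen": "small", "bandwidth": "low", "language": "fr"}
component = {"screen": "large", "bandwidth": "low", "language": "fr"}

conflicts = incompatibilities(context, component)
# For each conflicting point the adapter would try customization, insertion,
# extraction or replacement, e.g. swapping in a component whose profile matches.
print(conflicts)   # {'screen': ('small', 'large')}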

Keywords: adaptative service, software component, service, dynamic adaptation

Procedia PDF Downloads 298
2644 Adaptive CFAR Analysis for Non-Gaussian Distribution

Authors: Bouchemha Amel, Chachoui Takieddine, H. Maalem

Abstract:

Automatic detection of targets in a modern radar system is based primarily on the concept of the adaptive CFAR detector. To achieve effective detection, we must minimize the influence of disturbances due to clutter. The detection algorithm adapts the CFAR detection threshold, which is proportional to the average power of the clutter, maintaining a constant probability of false alarm. In this article, we analyze the performance of two variants of adaptive algorithms, CA-CFAR and OS-CFAR, and we compare the thresholds of these detectors in a marine (non-Gaussian) environment with a Weibull distribution.
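
For illustration, a minimal Python sketch of a cell-averaging CFAR detector is given below. The scaling factor alpha is the classical result for square-law detection in exponentially distributed clutter; for the Weibull clutter considered in the paper, the threshold multiplier has to be adjusted, so the sketch only shows the generic CA-CFAR structure.

import numpy as np

def ca_cfar(power, num_train=16, num_guard=2, pfa=1e-4):
    """power: 1-D array of received power per range cell; returns boolean detections."""
    n = len(power)
    alpha = num_train * (pfa ** (-1.0 / num_train) - 1.0)   # exponential-clutter case
    detections = np.zeros(n, dtype=bool)
    half = num_train // 2 + num_guard
    for i in range(half, n - half):
        lead = power[i - half : i - num_guard]               # training cells before the CUT
        lag = power[i + num_guard + 1 : i + half + 1]        # training cells after the CUT
        noise = np.mean(np.concatenate([lead, lag]))         # clutter power estimate
        detections[i] = power[i] > alpha * noise
    return detections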

Keywords: CFAR, threshold, clutter, distribution, Weibull, detection

Procedia PDF Downloads 589
2643 Factors Associated with Risky Sexual Behaviour in Adolescent Girls and Young Women in Cambodia: A Systematic Review

Authors: Farwa Rizvi, Joanne Williams, Humaira Maheen, Elizabeth Hoban

Abstract:

There is an increase in risky sexual behavior and unsafe sex in adolescent girls and young women aged 15 to 24 years in Cambodia, which negatively affects their reproductive health by increasing the risk of contracting sexually transmitted infections and unintended pregnancies. Risky sexual behavior includes ‘having sex at an early age, having multiple sexual partners, having sex while under the influence of alcohol or drugs, and unprotected sexual behaviors’. A systematic review of quantitative research conducted in Cambodia was undertaken, using the theoretical framework of the Social Ecological Model to identify the personal, social and cultural factors associated with risky sexual behavior and unsafe sex in young Cambodian women. PRISMA guidelines were used to search databases including Medline Complete, PsycINFO, CINAHL Complete, Academic Search Complete, Global Health, and Social Work Abstracts. Additional searches were conducted in Science Direct, Google Scholar and grey literature sources. A risk-of-bias tool developed explicitly for the systematic review of cross-sectional studies was used. The summary item on the overall risk of study bias, after inter-rater assessment, showed that the risk of bias was high in two studies, moderate in one study and low in one study. The search strategy included a combination of subject terms and free text terms. The medical subject headings (MeSH) terms included were; contracept* or ‘birth control’ or ‘family planning’ or pregnan* or ‘safe sex’ or ‘protected intercourse’ or ‘unprotected intercourse’ or ‘protected sex’ or ‘unprotected sex’ or ‘risky sexual behaviour*’ or ‘abort*’ or ‘planned parenthood’ or ‘unplanned pregnancy’ AND ( barrier* or obstacle* or challenge* or knowledge or attitude* or factor* or determinant* or choic* or uptake or discontinu* or acceptance or satisfaction or ‘needs assessment’ or ‘non-use’ or ‘unmet need’ or ‘decision making’ ) AND Cambodia*. Initially, 300 studies were identified by using key words and, finally, four quantitative studies were selected based on the inclusion criteria. The four studies were published between 2010 and 2016. The study participants ranged in age from 10-24 years, single or married, with 3 to 10 completed years of education. The mean age at sexual debut was reported to be 18 years. Using the perspective of the Social Ecological Model, risky sexual behavior was associated with individual-level factors including young age at sexual debut, low education, unsafe sex under the influence of alcohol and substance abuse, and multiple sexual partners or transactional sex. Family level factors included living away from parents, orphan status and low levels of family support. Peer and partner level factors included peer delinquency and lack of condom use. Low socioeconomic status at the society level was also associated with risky sexual behaviour. There is scant research on the sexual and reproductive health of adolescent girls and young women in Cambodia. Individual, family and social factors were significantly associated with risky sexual behaviour. More research is required to inform potential preventive strategies and policies that address young women’s sexual and reproductive health.

Keywords: adolescents, high-risk sex, sexual activity, unplanned pregnancies

Procedia PDF Downloads 246
2642 Integrative Omics-Portrayal Disentangles Molecular Heterogeneity and Progression Mechanisms of Cancer

Authors: Binder Hans

Abstract:

Cancer is no longer seen as solely a genetic disease in which genetic defects such as mutations and copy number variations affect gene regulation and eventually lead to aberrant cell functioning that can be monitored by transcriptome analysis. It has become obvious that epigenetic alterations represent a further important layer of (de-)regulation of gene activity. For example, aberrant DNA methylation is a hallmark of many cancer types, and methylation patterns have been successfully used to subtype cancer heterogeneity. Hence, unraveling the interplay between different omics levels such as the genome, transcriptome and epigenome is inevitable for a mechanistic understanding of the molecular deregulation causing complex diseases such as cancer. This objective requires powerful downstream integrative bioinformatics methods as an essential prerequisite to discover the whole-genome mutational, transcriptome and epigenome landscapes of cancer specimens and to characterize cancer genesis, progression and heterogeneity. Basic challenges and tasks arise ‘beyond sequencing’ because of the big size of the data, their complexity, and the need to search for hidden structures in the data, for knowledge mining to discover biological function, and for systems biology conceptual models to deduce developmental interrelations between different cancer states. These tasks are tightly related to cancer biology as an (epi-)genetic disease giving rise to aberrant genomic regulation under micro-environmental control and clonal evolution, which leads to heterogeneous cellular states. Machine learning algorithms such as self-organizing maps (SOMs) represent one interesting option to tackle these bioinformatics tasks. The SOM method enables recognizing complex patterns in large-scale data generated by high-throughput omics technologies. It portrays molecular phenotypes by generating individualized, easy-to-interpret images of the data landscape in combination with comprehensive analysis options. Our image-based, reductionist machine learning methods provide one interesting perspective on how to deal with massive data in the discovery of complex diseases such as gliomas, melanomas and colon cancer at the molecular level. As an important new challenge, we address the combined portrayal of different omics data such as genome-wide genomic, transcriptomic and methylomic data. The integrative-omics portrayal approach is based on the joint training of the data, and it provides separate personalized data portraits for each patient and data type which can be analyzed by visual inspection as one option. The new method enables an integrative genome-wide view on the omics data types and the underlying regulatory modes. It is applied to high- and low-grade gliomas and to melanomas, where it disentangles transversal and longitudinal molecular heterogeneity in terms of distinct molecular subtypes and progression paths with prognostic impact.
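
A hedged Python sketch of the SOM "portrayal" idea is given below, using the third-party minisom package: a map is trained on combined omics feature vectors and a per-sample image counts how often the sample's features hit each map unit. Grid size, learning parameters and the data layout are illustrative assumptions, not the authors' pipeline.

import numpy as np
from minisom import MiniSom   # third-party package "minisom"

def train_portrait_map(data, grid=(30, 30), iterations=10000):
    """data: array of feature vectors, e.g. expression and methylation values joined per row."""
    som = MiniSom(grid[0], grid[1], data.shape[1], sigma=1.5, learning_rate=0.5)
    som.random_weights_init(data)
    som.train_random(data, iterations)
    return som

def portrait(som, sample_features, grid=(30, 30)):
    """Return a 2-D image counting how often the sample's vectors map to each SOM unit."""
    image = np.zeros(grid)
    for vec in sample_features:
        i, j = som.winner(vec)
        image[i, j] += 1
    return image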

Keywords: integrative bioinformatics, machine learning, molecular mechanisms of cancer, gliomas and melanomas

Procedia PDF Downloads 148
2641 IoT Based Soil Moisture Monitoring System for Indoor Plants

Authors: Gul Rahim Rahimi

Abstract:

The IoT-based soil moisture monitoring system for indoor plants is designed to address the challenges of maintaining optimal moisture levels in soil for plant growth and health. The system utilizes sensor technology to collect real-time data on soil moisture levels, which is then processed and analyzed using machine learning algorithms. This allows for accurate and timely monitoring of soil moisture levels, ensuring plants receive the appropriate amount of water to thrive. The main objectives of the system are twofold: to keep plants fresh and healthy by preventing water deficiency and to provide users with comprehensive insights into the water content of the soil on a daily and hourly basis. By monitoring soil moisture levels, users can identify patterns and trends in water consumption, allowing for more informed decision-making regarding watering schedules and plant care. The scope of the system extends to the agriculture industry, where it can be utilized to minimize the efforts required by farmers to monitor soil moisture levels manually. By automating the process of soil moisture monitoring, farmers can optimize water usage, improve crop yields, and reduce the risk of plant diseases associated with over or under-watering. Key technologies employed in the system include the Capacitive Soil Moisture Sensor V1.2 for accurate soil moisture measurement, the Node MCU ESP8266-12E Board for data transmission and communication, and the Arduino framework for programming and development. Additionally, machine learning algorithms are utilized to analyze the collected data and provide actionable insights. Cloud storage is utilized to store and manage the data collected from multiple sensors, allowing for easy access and retrieval of information. Overall, the IoT-based soil moisture monitoring system offers a scalable and efficient solution for indoor plant care, with potential applications in agriculture and beyond. By harnessing the power of IoT and machine learning, the system empowers users to make informed decisions about plant watering, leading to healthier and more vibrant indoor environments.

Keywords: IoT-based, soil moisture monitoring, indoor plants, water management

Procedia PDF Downloads 51
2640 Fully Autonomous Vertical Farm to Increase Crop Production

Authors: Simone Cinquemani, Lorenzo Mantovani, Aleksander Dabek

Abstract:

New technologies in agriculture are opening new challenges and new opportunities. Among these, robotics, vision, and artificial intelligence are certainly the ones that will make possible a significant leap compared to traditional agricultural techniques. In particular, the indoor farming sector will be the one that benefits the most from these solutions. Vertical farming is a new field of research where mechanical engineering can bring knowledge and know-how to transform a highly labor-based business into a fully autonomous system. The aim of the research is to develop a multi-purpose, modular, and perfectly integrated platform for crop production in indoor vertical farming. Activities will be based both on hardware development, such as automatic tools to perform different activities on soil and plants, and on research to introduce extensive use of monitoring techniques based on machine learning algorithms. This paper presents the preliminary results of a research project on a vertical farm living lab designed to (i) develop and test vertical farming cultivation practices, (ii) introduce a very high degree of mechanization and automation that makes all processes replicable, fully measurable, standardized and automated, (iii) develop a coordinated control and management environment for autonomous multiplatform or tele-operated robots, with the aim of carrying out complex tasks in the presence of environmental and cultivation constraints, and (iv) integrate AI-based algorithms as a decision support system to improve production quality. The coordinated management of multiplatform systems still presents innumerable challenges that require a strongly multidisciplinary approach right from the design, development, and implementation phases. The methodology is based on (i) the development of models capable of describing the dynamics of the various platforms and their interactions, (ii) the integrated design of mechatronic systems able to respond to the needs of the context and to exploit the strengths highlighted by the models, and (iii) implementation and experimental tests performed to assess the real effectiveness of the systems created and evaluate any weaknesses so as to proceed with targeted development. To these ends, a fully automated laboratory for growing plants in vertical farming has been developed and tested. The living lab makes extensive use of sensors to determine the overall state of the structure, crops, and systems used. The possibility of having specific measurements for each element involved in the cultivation process makes it possible to evaluate the effects of each variable of interest and allows for the creation of a robust model of the system as a whole. The automation of the laboratory is completed with the use of robots to carry out all the necessary operations, from sowing to handling to harvesting. These systems work synergistically thanks to the knowledge of detailed models developed based on the information collected, which allows for deepening the knowledge of these types of crops and guarantees the possibility of tracing every action performed on each single plant. To this end, artificial intelligence algorithms have been developed to allow synergistic operation of all systems.

Keywords: automation, vertical farming, robot, artificial intelligence, vision, control

Procedia PDF Downloads 40
2639 Double Burden of Malnutrition among Children under Five in Sub-Saharan Africa and Other Least Developed Countries: A Systematic Review

Authors: Getenet Dessie, Jinhu Li, Son Nghiem, Tinh Doan

Abstract:

Background: Concerns regarding malnutrition have evolved from focusing solely on single forms to addressing the simultaneous occurrence of multiple types, commonly referred to as the double or triple burden of malnutrition. Nevertheless, data concerning the concurrent occurrence of various types of malnutrition are scarce. Therefore, this systematic review and meta-analysis aims to assess the pooled prevalence of the double burden of malnutrition among children under five in Sub-Saharan Africa and other least-developed countries (LDCs). Methods: Electronic, web-based searches were conducted from January 15 to June 28, 2023, across several databases, including PubMed, Embase, Google Scholar, and the World Health Organization's Hinari portal, as well as other search engines, to identify primary studies published up to June 28, 2023. Laboratory-based cross-sectional studies on children under the age of five were included. Two independent authors assessed the risk of bias and the quality of the identified articles. The primary outcomes of this study were micronutrient deficiencies and the comorbidity of stunting and anemia, as well as wasting and anemia. The random-effects model was utilized for analysis. The association of identified variables with the various forms of malnutrition was also assessed using adjusted odds ratios (AOR) with a 95% confidence interval (CI). This review was registered in PROSPERO with the reference number CRD42023409483. Findings: The electronic search generated 6,087 articles, 93 of which matched the inclusion criteria for the final meta-analysis. Micronutrient deficiencies were prevalent among children under five in Sub-Saharan Africa and other LDCs, with rates ranging from 16.63% among 25,169 participants for vitamin A deficiency to 50.90% among 3,936 participants for iodine deficiency. Iron deficiency anemia affected 20.56% of the 63,121 participants. The combined prevalence of wasting anemia and stunting anemia was 5.41% among 64,709 participants and 19.98% among 66,016 participants, respectively. Both stunting and vitamin A supplementation were associated with vitamin A and iron deficiencies, with adjusted odds ratios (AOR) of 1.54 (95% CI: 1.01, 2.37) and 1.37 (95% CI: 1.21, 1.55), respectively. Interpretation: The prevalence of the double burden of malnutrition among children under the age of five was notably high in Sub-Saharan Africa and other LDCs. These findings indicate a need for increased attention and a focus on understanding the factors influencing this double burden of malnutrition.

Keywords: children, Sub-Saharan Africa, least developed countries, double burden of malnutrition, systematic review, meta-analysis

Procedia PDF Downloads 81
2638 Artificial Intelligence for Traffic Signal Control and Data Collection

Authors: Reggie Chandra

Abstract:

Traffic accidents and traffic signal optimization are correlated. However, 70-90% of the traffic signals across the USA are not synchronized. The reason behind that is insufficient resources to create and implement timing plans. In this work, we will discuss the use of a breakthrough Artificial Intelligence (AI) technology to optimize traffic flow and collect 24/7/365 accurate traffic data using a vehicle detection system. We will discuss recent advances in Artificial Intelligence technology, how AI works in vehicle, pedestrian, and bike data collection and in creating timing plans, and what the best workflow is for that. Apart from that, this paper will showcase how Artificial Intelligence makes signal timing affordable. We will introduce a technology that uses Convolutional Neural Networks (CNN) and deep learning algorithms to detect, collect data, develop timing plans, and deploy them in the field. Convolutional Neural Networks are a class of deep learning networks inspired by the biological processes in the visual cortex. A neural net is modeled after the human brain. It consists of millions of densely connected processing nodes. It is a form of machine learning where the neural net learns to recognize vehicles through training - which is called deep learning. The well-trained algorithm overcomes most of the issues faced by other detection methods and provides nearly 100% traffic data accuracy. Through this continuous learning-based method, we can constantly update traffic patterns, generate an unlimited number of timing plans, and thus improve vehicle flow. Convolutional Neural Networks not only outperform other detection algorithms but also, in cases such as classifying objects into fine-grained categories, outperform humans. Safety is of primary importance to traffic professionals, but they don't have the studies or data to support their decisions. Currently, one-third of transportation agencies do not collect pedestrian and bike data. We will discuss how the use of Artificial Intelligence for data collection can help reduce pedestrian fatalities and enhance the safety of all vulnerable road users. Moreover, it provides traffic engineers with tools that allow them to unleash their potential, instead of dealing with constant complaints, limited handpicked snapshots of data, and multiple systems requiring additional work for adaptation. The methodologies used and proposed in the research contain a camera model identification method based on deep Convolutional Neural Networks. The proposed application was evaluated on our data sets, acquired under a variety of daily real-world road conditions, and compared with the performance of commonly used methods that require collecting data by counting, evaluating and adapting it, running it through well-established algorithms, and then deploying it to the field. This work explores themes such as how technologies powered by Artificial Intelligence can benefit your community and how to translate the complex and often overwhelming benefits into a language accessible to elected officials, community leaders, and the public. Exploring such topics empowers citizens with insider knowledge about the potential of better traffic technology to save lives and improve communities. The synergies that Artificial Intelligence brings to traffic signal control and data collection are unsurpassed.

Keywords: artificial intelligence, convolutional neural networks, data collection, signal control, traffic signal

Procedia PDF Downloads 169
2637 Cluster Based Ant Colony Routing Algorithm for Mobile Ad-Hoc Networks

Authors: Alaa Eddien Abdallah, Bajes Yousef Alskarnah

Abstract:

Ant colony based routing algorithms are known to guarantee packet delivery, but they suffer from the huge overhead of control messages needed to discover the route. In this paper we utilize the positions of the network nodes to group the nodes into connected clusters. We use cluster heads only for forwarding the route-discovery control messages. Our simulations prove that the new algorithm decreases the overhead dramatically without affecting the delivery rate.
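
A minimal Python sketch of the position-based grouping step is shown below: nodes falling in the same grid square form a cluster, and the node closest to the square's centre acts as cluster head for forwarding route-discovery messages. The grid-based clustering rule and cell size are illustrative assumptions, not necessarily the authors' exact scheme.

import math
from collections import defaultdict

def elect_cluster_heads(nodes, cell=100.0):
    """nodes: dict of node_id -> (x, y). Returns dict of grid cell -> cluster head id."""
    clusters = defaultdict(list)
    for nid, (x, y) in nodes.items():
        clusters[(int(x // cell), int(y // cell))].append(nid)
    heads = {}
    for (cx, cy), members in clusters.items():
        centre = ((cx + 0.5) * cell, (cy + 0.5) * cell)
        heads[(cx, cy)] = min(members, key=lambda nid: math.dist(nodes[nid], centre))
    return heads

heads = elect_cluster_heads({1: (10, 20), 2: (40, 60), 3: (220, 250)})
print(heads)   # e.g. {(0, 0): 2, (2, 2): 3}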

Keywords: ad-hoc network, MANET, ant colony routing, position based routing

Procedia PDF Downloads 425
2636 Block Implicit Adams Type Algorithms for Solution of First Order Differential Equation

Authors: Asabe Ahmad Tijani, Y. A. Yahaya

Abstract:

The paper considers the derivation of implicit Adams-Moulton type methods with k=4 and 5. We adopted the method of interpolation and collocation of a power series approximation to generate the continuous formula, which was evaluated at off-grid and some grid points within the step length to generate the proposed block schemes. The schemes were investigated and found to be consistent and zero-stable. Finally, the methods were tested with numerical experiments to ascertain their level of accuracy.
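
For reference, the classical (non-block) four-step implicit Adams-Moulton formula that such block schemes build on can be written, with f_j = f(x_j, y_j), as

y_{n+4} = y_{n+3} + \frac{h}{720}\left(251 f_{n+4} + 646 f_{n+3} - 264 f_{n+2} + 106 f_{n+1} - 19 f_{n}\right).

The block variants described in the abstract are obtained by evaluating the continuous collocation polynomial at additional grid and off-grid points, so that several such formulas are applied simultaneously over a block of steps.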

Keywords: Adam-Moulton Type (AMT), off-grid, block method, consistent and zero stable

Procedia PDF Downloads 482
2635 An Alternative Way to Mapping Cone

Authors: Yousuf Alkhezi

Abstract:

Most of the literature on algebra does not deal much with the special case of the mapping cone. This paper offers an alternative way to examine the special tensor product and the mapping cone. We also show that the isomorphism implying that the mapping cone commutes with the tensor product, which holds for the ordinary tensor product, no longer holds for the pinched tensor product; however, we show that there is still a morphism. We introduce an alternative construction of the mapping cone. We are looking for more properties, which is our future project, and we want to apply these new properties in applications. Many results and examples with classical algorithms will be provided.
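
For context, a standard definition of the mapping cone of a chain map f: A \to B between chain complexes (up to sign conventions) is

\mathrm{Cone}(f)_n = A_{n-1} \oplus B_n, \qquad d(a, b) = \big(-d_A(a),\; d_B(b) - f(a)\big).

The usual compatibility with the ordinary tensor product is the isomorphism \mathrm{Cone}(f) \otimes C \cong \mathrm{Cone}(f \otimes C); the abstract indicates that for the pinched tensor product only a morphism remains in its place.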

Keywords: complex, tensor product, pinched tensor product, mapping cone

Procedia PDF Downloads 130
2634 The Role of Optimization and Machine Learning in e-Commerce Logistics in 2030

Authors: Vincenzo Capalbo, Gianpaolo Ghiani, Emanuele Manni

Abstract:

Global e-commerce sales have reached unprecedented levels in the past few years. As this trend is only predicted to go up as we continue into the ’20s, new challenges will be faced by companies when planning and controlling e-commerce logistics. In this paper, we survey the related literature on Optimization and Machine Learning as well as on combined methodologies. We also identify the distinctive features of next-generation planning algorithms - namely scalability, model-and-run features and learning capabilities - that will be fundamental to cope with the scale and complexity of logistics in the next decade.

Keywords: e-commerce, hardware acceleration, logistics, machine learning, mixed integer programming, optimization

Procedia PDF Downloads 253