Search results for: improved sparrow search algorithm
7534 Effects of Zinc and Vitamin A Supplementation on Prognostic Markers and Treatment Outcomes of Adults with Pulmonary Tuberculosis: A Systematic Review and Meta-Analysis
Authors: Fasil Wagnew, Kefyalew Addis Alene, Setegn Eshetie, Tom Wingfield, Matthew Kelly, Darren Gray
Abstract:
Introduction: Undernutrition is a major and under-appreciated risk factor for TB, estimated to be responsible for 1.9 million TB cases per year globally. The effectiveness of micronutrient supplementation on TB treatment outcomes and its prognostic markers, such as sputum conversion and serum zinc, retinol, and hemoglobin levels, remains poorly understood. This systematic review and meta-analysis aimed to determine the association between zinc and vitamin A supplementation and TB treatment outcomes and its prognostic markers. Methods: A systematic literature search for randomized controlled trials (RCTs) was performed in the PubMed, Embase, and Scopus databases. Meta-analysis with a random-effects model was performed to estimate the risk ratio (RR) and mean difference (MD), with 95% confidence intervals (CI), for dichotomous and continuous outcomes, respectively. Results: Our search identified 2,195 records. Of these, nine RCTs comprising 1,375 participants were included in the final analyses. Among adults with pulmonary TB, zinc (RR: 0.94, 95% CI: 0.86, 1.03), vitamin A (RR: 0.90, 95% CI: 0.80, 1.01), and combined zinc and vitamin A (RR: 0.98, 95% CI: 0.89, 1.08) supplementation were not significantly associated with TB treatment success. Combined zinc and vitamin A supplementation was significantly associated with increased sputum smear conversion at 2 months (RR: 1.16, 95% CI: 1.03, 1.32), serum zinc levels at 2 months (MD: 0.86 umol/l, 95% CI: 0.14, 1.57), serum retinol levels at 2 months (MD: 0.06 umol/l, 95% CI: 0.04, 0.08) and 6 months (MD: 0.12 umol/l, 95% CI: 0.10, 0.14), and serum hemoglobin levels at 6 months (MD: 0.29 g/dl, 95% CI: 0.08 to 0.51) among adults with TB. Conclusions: Providing zinc and vitamin A supplementation to adults with pulmonary TB during treatment may increase early sputum smear conversion and serum zinc, retinol, and hemoglobin levels. However, the use of zinc, vitamin A, or both was not associated with TB treatment success.
Keywords: zinc and vitamin A supplementation, tuberculosis, treatment outcomes, meta-analysis, RCT
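For readers who want to reproduce this kind of pooling, a minimal DerSimonian-Laird random-effects sketch in Python is shown below; the study-level log risk ratios and variances are illustrative placeholders, not the review's data.

```python
import numpy as np

# Illustrative study-level data (NOT the values from this review):
# log risk ratios and their within-study variances.
log_rr = np.array([-0.06, -0.10, 0.02, -0.03])
var = np.array([0.004, 0.010, 0.006, 0.008])

# Fixed-effect weights and Cochran's Q heterogeneity statistic.
w_fixed = 1.0 / var
mu_fixed = np.sum(w_fixed * log_rr) / np.sum(w_fixed)
q = np.sum(w_fixed * (log_rr - mu_fixed) ** 2)

# DerSimonian-Laird estimate of the between-study variance tau^2.
k = len(log_rr)
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (q - (k - 1)) / c)

# Random-effects pooling with a 95% confidence interval.
w_re = 1.0 / (var + tau2)
mu_re = np.sum(w_re * log_rr) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
rr, lo, hi = np.exp([mu_re, mu_re - 1.96 * se, mu_re + 1.96 * se])
print(f"pooled RR = {rr:.2f} (95% CI: {lo:.2f}, {hi:.2f})")
```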
Procedia PDF Downloads 171
7533 Diagnostic Performance of Mean Platelet Volume in the Diagnosis of Acute Myocardial Infarction: A Meta-Analysis
Authors: Kathrina Aseanne Acapulco-Gomez, Shayne Julieane Morales, Tzar Francis Verame
Abstract:
Mean platelet volume (MPV) is the most accurate measure of the size of platelets and is routinely measured by most automated hematological analyzers. Several studies have shown associations between MPV and cardiovascular risks and outcomes. Although its measurement may provide useful data, MPV is not yet part of routine clinical decision making. The aim of this systematic review and meta-analysis is to determine summary estimates of the diagnostic accuracy of mean platelet volume for the diagnosis of myocardial infarction among adult patients with angina and/or its equivalents, in terms of sensitivity, specificity, diagnostic odds ratio, and likelihood ratios, and to determine the difference in mean MPV values between those with MI and those in the non-MI controls. The primary search was done through the electronic databases PubMed, Cochrane Review CENTRAL, HERDIN (Health Research and Development Information Network), Google Scholar, the Philippine Journal of Pathology, and the Philippine College of Physicians Philippine Journal of Internal Medicine. The reference lists of original reports were also searched. Cross-sectional, cohort, and case-control articles studying the diagnostic performance of mean platelet volume in the diagnosis of acute myocardial infarction in adult patients were included in the study. Studies were included if: (1) CBC was taken upon presentation to the ER or upon admission (within 24 hours of symptom onset); (2) myocardial infarction was diagnosed with serum markers, ECG, or according to guidelines accepted by the cardiology societies (American Heart Association (AHA), American College of Cardiology (ACC), European Society of Cardiology (ESC)); and (3) outcomes were measured as a significant difference and/or sensitivity and specificity. The authors independently screened all potentially eligible studies identified by the search. Eligible studies were appraised using well-defined criteria. Any disagreement between the reviewers was resolved through discussion and consensus. The overall mean MPV value of those with MI (9.702 fl; 95% CI 9.07 – 10.33) was higher than that of the non-MI control group (8.85 fl; 95% CI 8.23 – 9.46). Interpretation of the calculated t-value of 2.0827 showed that there was a significant difference in the mean MPV values of those with MI and those of the non-MI controls. The summary sensitivity (Se) and specificity (Sp) for MPV were 0.66 (95% CI: 0.59 – 0.73) and 0.60 (95% CI: 0.43 – 0.75), respectively. The pooled diagnostic odds ratio (DOR) was 2.92 (95% CI: 1.90 – 4.50). The positive likelihood ratio of MPV in the diagnosis of myocardial infarction was 1.65 (95% CI: 1.20 – 2.27), and the negative likelihood ratio was 0.56 (95% CI: 0.50 – 0.64). The intended role for MPV in the diagnostic pathway of myocardial infarction would perhaps be best as a triage tool. With a DOR of 2.92, MPV values can discriminate between those who have MI and those without. For a patient with angina presenting with elevated MPV values, it is 1.65 times more likely that he has MI. Thus, the decision to treat a patient with angina or its equivalents as a case of MI could be supported by an elevated MPV value.
Keywords: mean platelet volume, MPV, myocardial infarction, angina, chest pain
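As a quick sanity check on how these summary measures relate, the short sketch below recomputes the likelihood ratios and DOR from the pooled sensitivity and specificity; it closely reproduces the reported point estimates (small differences come from the pooled bivariate model and rounding).

```python
# Pooled summary estimates reported above.
se, sp = 0.66, 0.60

lr_pos = se / (1 - sp)          # positive likelihood ratio
lr_neg = (1 - se) / sp          # negative likelihood ratio
dor = lr_pos / lr_neg           # diagnostic odds ratio

print(f"LR+ = {lr_pos:.2f}")    # 1.65, matching the reported value
print(f"LR- = {lr_neg:.2f}")    # 0.57; reported 0.56 after model-based pooling
print(f"DOR = {dor:.2f}")       # 2.91; reported 2.92
```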
Procedia PDF Downloads 87
7532 Phantom and Clinical Evaluation of Block Sequential Regularized Expectation Maximization Reconstruction Algorithm in Ga-PSMA PET/CT Studies Using Various Relative Difference Penalties and Acquisition Durations
Authors: Fatemeh Sadeghi, Peyman Sheikhzadeh
Abstract:
Introduction: The Block Sequential Regularized Expectation Maximization (BSREM) reconstruction algorithm was recently developed to suppress excessive noise by applying a relative difference penalty. The aim of this study was to investigate the effect of various strengths of the noise penalization factor in the BSREM algorithm under different acquisition durations and lesion sizes, in order to determine an optimum penalty factor by considering both quantitative and qualitative image evaluation parameters in clinical use. Materials and Methods: The NEMA IQ phantom and 15 clinical whole-body patients with prostate cancer were evaluated. The phantom and patients were injected with Gallium-68 Prostate-Specific Membrane Antigen (68Ga-PSMA) and scanned on a non-time-of-flight Discovery IQ Positron Emission Tomography/Computed Tomography (PET/CT) scanner with BGO crystals. The data were reconstructed using BSREM with a β-value of 100-500 at an interval of 100. These reconstructions were compared to OSEM as a widely used reconstruction algorithm. Following the standard NEMA measurement procedure, background variability (BV), recovery coefficient (RC), contrast recovery (CR) and residual lung error (LE) were measured from the phantom data, and signal-to-noise ratio (SNR), signal-to-background ratio (SBR) and tumor SUV were measured from the clinical data. Qualitative features of the clinical images were visually ranked by one nuclear medicine expert. Results: The β-value acts as a noise suppression factor, so BSREM showed decreasing image noise with an increasing β-value. BSREM with a β-value of 400 at a decreased acquisition duration (2 min/bp) produced a noise level approximately equal to that of OSEM at an increased acquisition duration (5 min/bp). For the β-value of 400 at 2 min/bp duration, SNR increased by 43.7% and LE decreased by 62%, compared with OSEM at a 5 min/bp duration. In both phantom and clinical data, an increase in the β-value translated into a decrease in SUV. The lowest levels of SUV and noise were reached with the highest β-value (β=500), resulting in the highest SNR and lowest SBR, due to the greater reduction in noise than in SUV at the highest β-value. In comparing BSREM reconstructions with different β-values, the relative difference in the quantitative parameters was generally larger for smaller lesions. As the β-value decreased from 500 to 100, the increase in CR was 160.2% for the smallest sphere (10 mm) and 12.6% for the largest sphere (37 mm), and the trend was similar for SNR (-58.4% and -20.5%, respectively). BSREM was visually ranked higher than OSEM on all qualitative features. Conclusions: The BSREM algorithm using more iteration numbers leads to more quantitative accuracy without excessive noise, which translates into higher overall image quality and lesion detectability. This improvement can be used to shorten acquisition time.
Keywords: BSREM reconstruction, PET/CT imaging, noise penalization, quantification accuracy
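The relative difference penalty referred to here is commonly written, following Nuyts et al., as a sum over neighboring voxel pairs; the sketch below evaluates that form for a 2-D image with a 4-neighborhood. This is an illustrative implementation of the penalty term only (the edge-preservation parameter gamma is an assumed value), not of any vendor's full BSREM reconstruction.

```python
import numpy as np

def relative_difference_penalty(img, gamma=2.0, eps=1e-12):
    """Relative difference prior (after Nuyts et al.):
    sum over neighbor pairs of (x_j - x_k)^2 / (x_j + x_k + gamma*|x_j - x_k|).
    Larger gamma penalizes edges less; the beta weight is applied outside."""
    total = 0.0
    for axis in (0, 1):                       # right and down neighbors, each pair once
        a = img
        b = np.roll(img, -1, axis=axis)
        sl = [slice(None), slice(None)]
        sl[axis] = slice(0, -1)               # drop the wrapped-around border pairs
        a, b = a[tuple(sl)], b[tuple(sl)]
        diff = a - b
        total += np.sum(diff**2 / (a + b + gamma * np.abs(diff) + eps))
    return total

img = np.abs(np.random.default_rng(0).normal(size=(64, 64)))
print(relative_difference_penalty(img))       # value scaled by beta inside BSREM
```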
Procedia PDF Downloads 97
7531 Research on the Effectiveness of Online Guided Case Teaching in Problem-Based Learning: A Preschool Special Education Course
Authors: Chen-Ya Juan
Abstract:
Problem-Based Learning uses vague questions to guide student thinking and enhance their self-learning and collaboration. Most teachers implement PBL in a physical classroom, where they can monitor and evaluate students' learning progress and guide them to search resources for answers. However, the Covid-19 pandemic forced a shift from physical teaching to distance teaching. This instruction applied case-based Problem-Based Learning to distance teaching via the internet for college students. The study involved an experimental group with PBL and a control group without PBL. The teacher divided the students in the PBL class into eight groups of 7-8 students each and assigned a different case to each group. Three stages of instruction were developed: background knowledge learning, case analysis, and problem solving for each case. This study used a quantitative research method, a two-sample t-test, to test for a significant difference between the groups with and without PBL. Findings indicated that PBL increased the average score of special education knowledge: the average score improved by 20.46% in the PBL group and by 15.4% without PBL. The results did not show a significant difference (p = 0.589 > 0.05) in special education professional knowledge. However, the feedback of the PBL students implied they learned more about application, problem-solving skills, and critical thinking. PBL students were more likely to apply professional knowledge to an actual case and to find questions, resources, and answers. Most of them understood the importance of collaboration, working as a team, and communicating with other team members. The suggestions of this study included that (a) different web-based teaching instruments influenced students' learning; (b) it is difficult to monitor online PBL progress; (c) online PBL should be implemented flexibly and in a multi-oriented way; (d) although PBL did not show a significant difference between the groups with and without PBL, it did increase students' problem-solving skills and critical thinking.
Keywords: problem-based learning, college students, distance learning, case analysis, problem-solving
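A minimal sketch of the two-sample comparison used here, with made-up score gains standing in for the study's data (chosen so the result is non-significant, mirroring the reported finding):

```python
import numpy as np
from scipy import stats

# Hypothetical percentage score gains; NOT the study's raw data.
gain_pbl = np.array([22.0, 12.0, 30.0, 15.0, 25.0, 19.0])
gain_ctrl = np.array([18.0, 10.0, 26.0, 13.0, 21.0, 16.0])

t_stat, p_value = stats.ttest_ind(gain_pbl, gain_ctrl)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")  # p > 0.05: no significant difference
```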
Procedia PDF Downloads 130
7530 Mentoring of Health Professionals to Ensure Better Child-Birth and Newborn Care in Bihar, India: An Intervention Study
Authors: Aboli Gore, Aritra Das, Sunil Sonthalia, Tanmay Mahapatra, Sridhar Srikantiah, Hemant Shah
Abstract:
AMANAT is an initiative, taken in collaboration with the Government of Bihar, aimed at improving the quality of maternal and neonatal care services at Bihar's public health facilities – those offering either Basic Emergency Obstetric and Neonatal Care (BEmONC) or Comprehensive Emergency Obstetric and Neonatal Care (CEmONC) services. The effectiveness of this program is evaluated by conducting cross-sectional assessments at the concerned facilities prior to (baseline) and following completion of (endline) the intervention. The Direct Observation of Delivery (DOD) methodology is employed for carrying out the baseline and endline assessments, through which key obstetric and neonatal care practices among the health care providers (especially the nurses) are assessed quantitatively by specially trained nursing professionals. Assessment of vitals prior to delivery improved during all three phases of BEmONC and all four phases of CEmONC training, with statistically significant improvement noted in: i) pulse measurement in BEmONC phases 2 (9% to 68%), 3 (4% to 57%) & 4 (14% to 59%) and CEmONC phases 2 (7% to 72%) and 3 (0% to 64%); ii) blood pressure measurement in BEmONC phases 2 (27% to 84%), 3 (21% to 76%) & 4 (36% to 71%) and CEmONC phases 2 (23% to 76%) and 3 (2% to 70%); iii) fetal heart rate measurement in BEmONC phases 2 (10% to 72%), 3 (11% to 77%) & 4 (13% to 64%) and CEmONC phases 1 (24% to 38%), 2 (14% to 82%) and 3 (1% to 73%); and iv) abdominal examination in BEmONC phases 2 (14% to 59%), 3 (3% to 59%) & 4 (6% to 56%) and CEmONC phases 1 (0% to 24%), 2 (7% to 62%) & 3 (0% to 62%). Regarding infection control, the wearing of apron, mask and cap by the delivery conductors improved significantly in all BEmONC phases. Similarly, the practice of handwashing improved in all BEmONC and CEmONC phases. Even on disaggregation, handwashing showed significant improvement in all phases but CEmONC phase 4. Not only did positive practices related to handwashing improve, but negative practices such as turning off the tap with bare hands also declined significantly in the aforementioned phases. A significant decline was also noted in negative maternal care practices such as application of fundal pressure for hastening the delivery process and administration of oxytocin prior to delivery. One of the notable achievements of AMANAT is an improvement in active management of the third stage of labor (AMTSL). Overall AMTSL (including administration of oxytocin or other uterotonics in the proper dose, route and time, along with controlled cord traction and uterine massage) improved in all phases of BEmONC and CEmONC mentoring. Another key area of improvement, across phases, was proper cutting/clamping of the umbilical cord. AMANAT mentoring also led to improvement in important immediate newborn care practices such as initiation of skin-to-skin care and timely initiation of breastfeeding. The next phase of the mentoring program seeks to institutionalize mentoring across the state, which could potentially perpetuate improvement with minimal external intervention.
Keywords: capacity building, nurse-mentoring, quality of care, pregnancy, newborn care
Procedia PDF Downloads 162
7529 Trauma in the Unconsoled: A Crisis of the Self
Authors: Assil Ghariri
Abstract:
This article studies the process of rewriting the self through memory in Kazuo Ishiguro's novel The Unconsoled (1995). It deals with the journey that the protagonist, Mr. Ryder, takes through the unconscious in search of his real self, in which trauma stands as an obstacle. The article uses Carl Jung's theory of archetypes. Trauma, in this article, is discussed as one of the true obstacles of the unconscious that prevent people from realizing the truth about their selves.
Keywords: Carl Jung, Kazuo Ishiguro, memory, trauma
Procedia PDF Downloads 402
7528 Automatic Classification of Periodic Heart Sounds Using Convolutional Neural Network
Authors: Jia Xin Low, Keng Wah Choo
Abstract:
This paper presents an automatic normal and abnormal heart sound classification model developed based on a deep learning algorithm. MITHSDB heart sound datasets obtained from the 2016 PhysioNet/Computing in Cardiology Challenge database were used in this research, with the assumption that the electrocardiograms (ECG) were recorded simultaneously with the heart sounds (phonocardiogram, PCG). The PCG time series are segmented per heartbeat, and each sub-segment is converted to form a square intensity matrix and classified using convolutional neural network (CNN) models. This approach removes the need to provide classification features for the supervised machine learning algorithm. Instead, the features are determined automatically through training, from the time series provided. The results show that the prediction model is able to provide reasonable and comparable classification accuracy despite its simple implementation. This approach can be used for real-time classification of heart sounds in the Internet of Medical Things (IoMT), e.g. remote monitoring applications of the PCG signal.
Keywords: convolutional neural network, discrete wavelet transform, deep learning, heart sound classification
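A minimal sketch of the kind of CNN described, assuming each heartbeat segment has been reshaped into a 64x64 intensity matrix; the matrix size and layer counts are placeholder choices, not the paper's architecture:

```python
import torch
import torch.nn as nn

class HeartSoundCNN(nn.Module):
    """Binary normal/abnormal classifier over per-beat intensity matrices."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, 2)  # 64x64 input -> 16x16 after two pools

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = HeartSoundCNN()
batch = torch.randn(8, 1, 64, 64)   # 8 beats, single-channel 64x64 matrices
logits = model(batch)               # cross-entropy loss would be applied in training
print(logits.shape)                 # torch.Size([8, 2])
```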
Procedia PDF Downloads 348
7527 ChatGPT 4.0 Demonstrates Strong Performance in Standardised Medical Licensing Examinations: Insights and Implications for Medical Educators
Authors: K. O'Malley
Abstract:
Background: The emergence and rapid evolution of large language models (LLMs) (i.e., models of generative artificial intelligence, or AI) has been unprecedented. ChatGPT is one of the most widely used LLM platforms. Using natural language processing technology, it generates customized responses to user prompts, enabling it to mimic human conversation. Responses are generated using predictive modeling of vast swathes of internet text and data and are further refined and reinforced through user feedback. The popularity of LLMs is increasing, with a growing number of students utilizing these platforms for study and revision purposes. Notwithstanding its many novel applications, LLM technology is inherently susceptible to bias and error. This poses a significant challenge in the educational setting, where academic integrity may be undermined. This study aims to evaluate the performance of the latest iteration of ChatGPT (ChatGPT 4.0) in standardized state medical licensing examinations. Methods: A considered search strategy was used to interrogate the PubMed electronic database. The keywords 'ChatGPT' AND 'medical education' OR 'medical school' OR 'medical licensing exam' were used to identify relevant literature. The search included all peer-reviewed literature published in the past five years and was limited to publications in the English language. Eligibility was ascertained based on the study title and abstract and confirmed by consulting the full-text document. Data were extracted into a Microsoft Excel document for analysis. Results: The search yielded 345 publications that were screened. 225 original articles were identified, of which 11 met the pre-determined criteria for inclusion in a narrative synthesis. These studies included performance assessments in national medical licensing examinations from the United States, United Kingdom, Saudi Arabia, Poland, Taiwan, Japan and Germany. ChatGPT 4.0 achieved scores ranging from 67.1 to 88.6 percent. The mean score across all studies was 82.49 percent (SD = 5.95). In all studies, ChatGPT exceeded the threshold for a passing grade in the corresponding exam. Conclusion: The capabilities of ChatGPT in standardized academic assessment in medicine are robust. While this technology can potentially revolutionize higher education, it also presents several challenges with which educators have not had to contend before. The overall strong performance of ChatGPT, as outlined above, may lend itself to unfair use (such as the plagiarism of deliverable coursework) and pose unforeseen ethical challenges (arising from algorithmic bias). Conversely, it highlights potential pitfalls if users assume LLM-generated content to be entirely accurate. In the aforementioned studies, ChatGPT exhibits a margin of error between 11.4 and 32.9 percent, which resonates strongly with concerns regarding the quality and veracity of LLM-generated content. It is imperative to highlight these limitations, particularly to students in the early stages of their education, who are less likely to possess the requisite insight or knowledge to recognize errors, inaccuracies or false information. Educators must inform themselves of these emerging challenges to effectively address them and mitigate potential disruption in academic fora.
Keywords: artificial intelligence, ChatGPT, generative AI, large language models, licensing exam, medical education, medicine, university
Procedia PDF Downloads 32
7526 A Novel Combined Finger Counting and Finite State Machine Technique for ASL Translation Using Kinect
Authors: Rania Ahmed Kadry Abdel Gawad Birry, Mohamed El-Habrouk
Abstract:
This paper presents a brief survey of the techniques used for sign language recognition, along with the types of sensors used to perform the task. It presents a modified method for identification of an isolated sign language gesture using Microsoft Kinect with the OpenNI framework. It shows how to extract robust features from the depth image provided by Microsoft Kinect and the OpenNI interface and how to use them in creating a robust and accurate gesture recognition system for the purpose of ASL translation. PrimeSense's Natural Interaction Technology for End-user (NITE™) was also used in the C++ implementation of the system. The algorithm presents a simple finger counting algorithm for static signs, as well as a directional Finite State Machine (FSM) description of the hand motion, in order to help in translating a sign language gesture. This includes both letters and numbers performed by a user, which in turn may be used as an input for voice pronunciation systems.
Keywords: American sign language, finger counting, hand tracking, Microsoft Kinect
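To make the FSM idea concrete, here is a small sketch (in Python rather than the authors' C++, and with a made-up two-gesture vocabulary) that classifies a hand trajectory from a sequence of quantized motion directions:

```python
# Directions are assumed quantized from successive hand positions: U, D, L, R.
# Hypothetical gesture vocabulary; the paper's actual FSM states differ.
GESTURES = {
    "J": ["D", "D", "L"],   # e.g. a downward stroke, then a hook to the left
    "Z": ["R", "D", "R"],
}

def classify(directions):
    """Advance each gesture's FSM over the direction stream; return the first
    gesture whose accepting state is reached."""
    state = {name: 0 for name in GESTURES}
    for d in directions:
        for name, pattern in GESTURES.items():
            if state[name] < len(pattern):
                if d == pattern[state[name]]:
                    state[name] += 1          # matched: advance this FSM
                elif d != pattern[max(state[name] - 1, 0)]:
                    state[name] = 0           # mismatch: reset (repeats tolerated)
            if state[name] == len(pattern):
                return name
    return None

print(classify(["D", "D", "D", "L"]))  # -> "J"
```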
Procedia PDF Downloads 296
7525 Sexual Health in the Over Forty-Fives: A Cross-Europe Project
Authors: Tess Hartland, Moitree Banerjee, Sue Churchill, Antonina Pereira, Ian Tyndall, Ruth Lowry
Abstract:
Background: Sexual health services and policies for middle-aged and older adults are underdeveloped, while global sexually transmitted infections in this age group are on the rise. The Interreg cross-Europe Sexual Health In Over 45s (SHIFT) project aims to increase participation in sexual health services and improve sexual health and wellbeing in people aged over 45, with an additional focus on disadvantaged groups. Methods: A two-pronged mixed methodology is being used to develop a model of good service provision in sexual health for over 45s. (1) Following PRISMA-ScR guidelines, a scoping review is being conducted using the databases PsychINFO, Web of Science, ERIC and PubMed, with a key search strategy using terms around sexual health, good practice, over 45s and disadvantaged groups. The initial search for literature yielded 7,914 results. (2) Surveys (n=1000) based on the Theory of Planned Behaviour are being administered across the UK, Belgium and the Netherlands to explore current sexual health knowledge, awareness and attitudes. Expected results: It is expected that sexual health needs and potential gaps in service provision will be identified in order to inform good practice for sexual health services for the target population. Results of the scoping review are being analysed, while focus group and survey data are being gathered. Preliminary analysis of the survey data highlights barriers to access such as limited risk awareness and stigma. All data analysis will be completed by the time of the conference. Discussion: Findings will inform the development of a model to improve sexual health and wellbeing among over 45s, a population which is often missed in sexual health policy improvement.
Keywords: adult health, disease prevention, health promotion, over 45s, sexual health
Procedia PDF Downloads 130
7524 Encephalon: An Implementation of a Handwritten Mathematical Expression Solver
Authors: Shreeyam, Ranjan Kumar Sah, Shivangi
Abstract:
Recognizing and solving handwritten mathematical expressions can be a challenging task, particularly when characters must be segmented and classified. This project proposes a solution that uses a Convolutional Neural Network (CNN) and image processing techniques to accurately solve various types of equations, including arithmetic, quadratic, and trigonometric equations, as well as logical operations like AND, OR, NOT, NAND, XOR, and NOR. The proposed solution also provides a graphical solution, allowing users to visualize equations and their solutions. In addition to equation solving, the platform, called CNNCalc, offers a comprehensive learning experience for students: it provides educational content, a quiz platform, and a coding platform for practicing programming skills in different languages like C, Python, and Java. This all-in-one solution makes the learning process engaging and enjoyable for students. The proposed methodology includes horizontal compact projection analysis for segmentation and binarization, as well as connected component analysis and integrated connected component analysis for character classification. The compact projection algorithm compresses the horizontal projections to remove noise and obtain a clearer image, contributing to the accuracy of character segmentation. The platform features a user-friendly interface with a graphical representation of the equations being solved, making it an interactive and engaging learning experience, and users can track their progress and work towards improving their skills. Experimental results demonstrate the accuracy and effectiveness of the proposed solution in solving a wide range of mathematical equations. With its comprehensive features and accurate results, CNNCalc is poised to revolutionize the way students learn and solve mathematical equations.
Keywords: AI, ML, handwritten equation solver, maths, computer, CNNCalc, convolutional neural networks
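The exact "compact projection" step is specific to the paper, but the plain horizontal-projection segmentation it builds on looks roughly like this (the ink threshold is illustrative):

```python
import numpy as np

def segment_rows(binary_img, min_sum=2):
    """Split a binarized image (1 = ink) into horizontal bands of writing
    using its horizontal projection profile (row sums)."""
    profile = binary_img.sum(axis=1)          # horizontal projection
    rows = profile > min_sum                  # rows that contain ink
    bands, start = [], None
    for i, has_ink in enumerate(rows):
        if has_ink and start is None:
            start = i
        elif not has_ink and start is not None:
            bands.append((start, i)); start = None
    if start is not None:
        bands.append((start, len(rows)))
    return bands

img = np.zeros((20, 30), dtype=int)
img[3:7, 5:25] = 1; img[12:16, 2:28] = 1      # two synthetic "lines" of ink
print(segment_rows(img))                      # [(3, 7), (12, 16)]
```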
Procedia PDF Downloads 122
7523 Registration of Multi-Temporal Unmanned Aerial Vehicle Images for Facility Monitoring
Authors: Dongyeob Han, Jungwon Huh, Quang Huy Tran, Choonghyun Kang
Abstract:
Unmanned Aerial Vehicles (UAVs) have been used for surveillance, monitoring, inspection, and mapping. In this paper, we present a systematic approach for automatic registration of UAV images for monitoring facilities such as buildings, greenhouses, and civil structures. A two-step process is applied: 1) an image matching technique based on SURF (Speeded Up Robust Features) and RANSAC (Random Sample Consensus); 2) bundle adjustment of multi-temporal images. Image matching to find corresponding points is one of the most important steps for the precise registration of multi-temporal images. We used the SURF algorithm to find matching points quickly and effectively. The RANSAC algorithm was used both in the process of finding matching points between images and in the bundle adjustment process. Experimental results from UAV images showed that our approach is accurate enough to be applied to facility change detection.
Keywords: building, image matching, temperature, unmanned aerial vehicle
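A compact OpenCV sketch of the matching step described above; note that SURF lives in the contrib xfeatures2d module and is patent-encumbered, so its availability depends on the build, and the file names and thresholds here are placeholders:

```python
import cv2
import numpy as np

img1 = cv2.imread("epoch1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("epoch2.jpg", cv2.IMREAD_GRAYSCALE)

surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)  # requires opencv-contrib
kp1, des1 = surf.detectAndCompute(img1, None)
kp2, des2 = surf.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_L2)
matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.7 * n.distance]  # Lowe ratio test

src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

# RANSAC rejects outlier correspondences while estimating the homography.
H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
print(f"{int(inlier_mask.sum())} inliers of {len(good)} matches")
```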
Procedia PDF Downloads 292
7522 Optimal Design of Tuned Inerter Damper-Based System for the Control of Wind-Induced Vibration in Tall Buildings through Cultural Algorithm
Authors: Luis Lara-Valencia, Mateo Ramirez-Acevedo, Daniel Caicedo, Jose Brito, Yosef Farbiarz
Abstract:
Controlling wind-induced vibrations, as well as aerodynamic forces, is an essential part of the structural design of tall buildings in order to guarantee the serviceability limit state of the structure. This paper presents a numerical investigation of the optimal design parameters of a Tuned Inerter Damper (TID) based system for the control of wind-induced vibration in tall buildings. The control system is based on the conventional TID, with the main difference that its location is changed from the ground level to the last two story levels of the structural system. The TID tuning procedure is based on an evolutionary cultural algorithm in which the optimum design variables, defined as the frequency and damping ratios, were searched according to the optimization criterion of minimizing the root mean square (RMS) response of displacements at the nth story of the structure. A Monte Carlo simulation was used to represent the dynamic action of the wind in the time domain, in which a time series derived from the Davenport spectrum using eleven harmonic functions with randomly chosen phase angles was reproduced. The above-mentioned methodology was applied to a case study derived from a 37-story prestressed concrete building of 144 m height, in which the wind action overcomes the seismic action. The results showed that the optimally tuned TID is effective in reducing the RMS response of displacements by up to 25%, which demonstrates the feasibility of the system for the control of wind-induced vibrations in tall buildings.
Keywords: evolutionary cultural algorithm, Monte Carlo simulation, tuned inerter damper, wind-induced vibrations
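The harmonic-superposition step can be sketched as follows; the Davenport spectrum form is quoted from memory, and the mean wind speed, drag coefficient, and sampling choices are illustrative assumptions, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(42)

V10 = 30.0          # mean wind speed at 10 m [m/s] (illustrative)
kappa = 0.005       # surface drag coefficient (illustrative)

def davenport_spectrum(f):
    """Davenport along-wind velocity spectrum S(f)."""
    x = 1200.0 * f / V10
    return 4.0 * kappa * V10**2 * x**2 / (f * (1.0 + x**2) ** (4.0 / 3.0))

# Eleven harmonics with random phases, as described in the abstract.
f = np.linspace(0.01, 1.0, 11)          # frequency points [Hz]
df = f[1] - f[0]
phi = rng.uniform(0.0, 2.0 * np.pi, f.size)

t = np.arange(0.0, 600.0, 0.1)          # 10-minute record, 10 Hz sampling
amp = np.sqrt(2.0 * davenport_spectrum(f) * df)
u = (amp[:, None] * np.cos(2.0 * np.pi * f[:, None] * t + phi[:, None])).sum(axis=0)
print(u.std())  # RMS of the simulated fluctuating wind speed
```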
Procedia PDF Downloads 135
7521 Optimization of Dez Dam Reservoir Operation Using Genetic Algorithm
Authors: Alireza Nikbakht Shahbazi, Emadeddin Shirali
Abstract:
Since optimization issues of water resources are complicated due to the variety of decision-making criteria and objective functions, it is sometimes impossible to resolve them through regular optimization methods, or doing so is time- and money-consuming. Therefore, the use of modern tools and methods is inevitable in resolving such problems. An accurate and essential utilization policy has to be determined in order to use natural resources such as water reservoirs optimally. Water reservoir programming studies aim to determine the final cultivated land area based on predefined agricultural models and water requirements; the dam utilization rule curve is also provided in such studies. The basic information applied in water reservoir programming studies generally includes meteorological, hydrological, agricultural and water reservoir related data, and the geometric characteristics of the reservoir. The system of Dez dam water resources was simulated by applying the basic information in order to determine the capability of its reservoir to meet the objectives of the performed plan. As a metaheuristic method, a genetic algorithm was applied in order to provide utilization rule curves (intersecting the reservoir volume). MATLAB software was used to solve the aforesaid model. Rule curves were first obtained through the genetic algorithm. Then the significance of using rule curves, and the decrease in decision-making variables in the system, was determined through system simulation and by comparing the results with optimization results (Standard Operating Procedure). One of the most essential issues in optimization of a complicated water resource system is the increasing number of variables, because of which a lot of time is required to find an optimum answer and, in some cases, no desirable result is obtained. In this research, intersecting the reservoir volume has been applied as a modern model in order to reduce the number of variables. Water reservoir programming studies were performed based on basic information, general hypotheses and standards, applying a monthly simulation technique for a statistical period of 30 years. Results indicated that application of the rule curve prevents extreme shortages and decreases the monthly shortages.
Keywords: optimization, rule curve, genetic algorithm method, Dez dam reservoir
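A bare-bones GA for tuning twelve monthly rule-curve levels might look like the following; the toy shortage objective, demand/inflow figures, and GA settings are stand-ins for the study's MATLAB model, not its actual data:

```python
import numpy as np

rng = np.random.default_rng(1)

DEMAND = np.full(12, 80.0)                 # toy monthly demand [MCM]
INFLOW = rng.uniform(40.0, 160.0, 12)      # toy monthly inflows [MCM]
CAPACITY = 500.0

def shortage(rule):
    """Simulate one year: release down to the monthly rule level and
    accumulate squared supply shortages (a toy stand-in for the real model)."""
    storage, cost = CAPACITY / 2.0, 0.0
    for m in range(12):
        storage = min(storage + INFLOW[m], CAPACITY)
        release = min(max(storage - rule[m], 0.0), DEMAND[m])
        storage -= release
        cost += (DEMAND[m] - release) ** 2
    return cost

pop = rng.uniform(0.0, CAPACITY, (60, 12))             # initial population
for gen in range(200):
    fit = np.array([shortage(ind) for ind in pop])
    parents = pop[np.argsort(fit)[:30]]                # truncation selection
    cut = rng.integers(1, 12, 30)                      # one-point crossover
    kids = np.where(np.arange(12) < cut[:, None], parents, parents[::-1])
    kids += rng.normal(0.0, 5.0, kids.shape) * (rng.random(kids.shape) < 0.1)
    pop = np.clip(np.vstack([parents, kids]), 0.0, CAPACITY)

print("best shortage cost:", shortage(pop[0]))         # pop[0] is the elite parent
```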
Procedia PDF Downloads 265
7520 Alteration of Placental Development and Vascular Dysfunction in Gestational Diabetes Mellitus Has Impact on Maternal and Infant Health
Authors: Sadia Munir
Abstract:
The aim of this study is to investigate changes in placental development and vascular dysfunction which subsequently affect feto-maternal health in pregnancies complicated by gestational diabetes mellitus (GDM). Fetal and postnatal adverse health outcomes of GDM are shown to be associated with disturbances in placental structure and function. Children of women with GDM are more likely to be obese and diabetic in childhood and adulthood. GDM also increases the risk of adverse pregnancy outcomes, including preeclampsia, birth injuries, macrosomia, neonatal hypoglycemia, respiratory distress syndrome, neonatal cardiac dysfunction and stillbirth. Incidences of type 2 diabetes in the MENA region are growing at an alarming rate and are estimated to more than double by 2030. Five of the top 10 countries for diabetes prevalence in 2010 were in the Gulf region. GDM also increases the risk of development of type 2 diabetes; interestingly, more than half of the women with GDM develop diabetes later in their life. The human placenta is a temporary organ located at the interface between the maternal and fetal blood circulations. The placenta has a central role as both a producer and a target of several molecules that are involved in placental development and function. We performed a PubMed search with the key words placenta, GDM, placental villi, vascularization, cytokines, growth factors, inflammation, hypoxia, oxidative stress and pathophysiology. We investigated differences in the development and vascularization of the placenta, their underlying causes and their impact on feto-maternal health through a literature review. We also identified gaps in the literature and research questions that need to be answered to completely understand the central role of the placenta in GDM. This study is important in understanding the pathophysiology of the placenta due to changes in the vascularization of villi and in the surface area and diameter of villous capillaries in pregnancies complicated by GDM. It is necessary to understand these mechanisms in order to develop treatments to reverse their effects on placental malfunctioning, which in turn will result in improved mother and child health.
Keywords: gestational diabetes mellitus, placenta, vasculature, villi
Procedia PDF Downloads 318
7519 Maximization of Lifetime for Wireless Sensor Networks Based on Energy Efficient Clustering Algorithm
Authors: Frodouard Minani
Abstract:
Over the last decade, wireless sensor networks (WSNs) have been used in many areas such as health care, agriculture, defense, military, and disaster-hit areas. Wireless Sensor Networks consist of a Base Station (BS) and a number of wireless sensors in order to monitor temperature, pressure, and motion in different environmental conditions. The key parameter that plays a major role in designing a protocol for Wireless Sensor Networks is energy efficiency: energy is the scarcest resource of sensor nodes, and it determines their lifetime. Maximizing sensor node lifetime is an important issue in the design of applications and protocols for Wireless Sensor Networks. Clustering sensor nodes is an effective topology control approach for helping to achieve this goal. In this paper, the researcher presents an energy-efficient protocol to prolong the network lifetime based on an energy-efficient clustering algorithm. The Low Energy Adaptive Clustering Hierarchy (LEACH) is a routing protocol for clusters which is used to lower the energy consumption and also to improve the lifetime of Wireless Sensor Networks. Minimizing energy dissipation and maximizing network lifetime are important matters in the design of applications and protocols for wireless sensor networks. The proposed system maximizes the lifetime of Wireless Sensor Networks by choosing the farthest cluster head (CH) instead of the closest CH and by forming the clusters while considering parameter metrics such as node density, residual energy and distance between clusters (inter-cluster distance). In this paper, comparisons between the proposed protocol and comparative protocols in different scenarios have been made, and the simulation results showed that the proposed protocol performs well over other comparative protocols in various scenarios.
Keywords: base station, clustering algorithm, energy efficient, sensors, wireless sensor networks
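A sketch of how the three selection metrics named above might be combined into a cluster-head score; the weights, the scoring form, and the field geometry are assumptions for illustration, not the paper's exact rule:

```python
import numpy as np

rng = np.random.default_rng(7)

n = 50
pos = rng.uniform(0.0, 100.0, (n, 2))        # node positions in a 100x100 field
energy = rng.uniform(0.2, 1.0, n)            # normalized residual energy

# Node density: neighbors within a fixed communication radius.
dists = np.linalg.norm(pos[:, None] - pos[None, :], axis=2)
density = (dists < 20.0).sum(axis=1) - 1

# Elect heads one by one, favoring nodes far from already chosen heads
# (spreading heads apart, in the spirit of the "farthest CH" idea).
heads = []
for _ in range(5):                           # elect 5 cluster heads
    spread = dists[:, heads].mean(axis=1) if heads else np.ones(n)
    score = (0.5 * energy
             + 0.3 * density / density.max()
             + 0.2 * spread / spread.max())
    score[heads] = -np.inf                   # a node can be head only once
    heads.append(int(np.argmax(score)))

print("elected cluster heads:", heads)
```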
Procedia PDF Downloads 144
7518 Cooperative Diversity Scheme Based on MIMO-OFDM in Small Cell Network
Authors: Dong-Hyun Ha, Young-Min Ko, Chang-Bin Ha, Hyoung-Kyu Song
Abstract:
A heterogeneous network (HetNet) can provide high quality of service in a wireless communication system through the composition of small cell networks. The composition of small cell networks improves cell coverage and capacity for the mobile users. Recently, various techniques using small cell networks have been researched in wireless communication systems. In this paper, a cooperative scheme obtaining high reliability is proposed for small cell networks. The proposed scheme suggests a cooperative small cell system and a new signal transmission technique in the proposed system model. The new signal transmission technique applies a cyclic delay diversity (CDD) scheme based on the multiple-input multiple-output orthogonal frequency division multiplexing (MIMO-OFDM) system to obtain improved performance. The improved performance of the proposed scheme is confirmed by the simulation results.
Keywords: adaptive transmission, cooperative communication, diversity gain, OFDM
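Cyclic delay diversity itself is simple to express: each transmit antenna sends a cyclically shifted copy of the same time-domain OFDM symbol, which the receiver sees as a single, more frequency-selective channel. A minimal numpy sketch (the subcarrier count and shifts are arbitrary choices, not the paper's parameters):

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64                                   # number of OFDM subcarriers
CP = 16                                  # cyclic prefix length
shifts = [0, 2]                          # per-antenna cyclic delays in samples

# One OFDM symbol of random QPSK data on all subcarriers.
bits = rng.integers(0, 4, N)
X = np.exp(1j * (np.pi / 4 + np.pi / 2 * bits))
x = np.fft.ifft(X)                       # time-domain symbol

tx = []
for d in shifts:
    xd = np.roll(x, d)                   # cyclic shift = per-antenna delay
    tx.append(np.concatenate([xd[-CP:], xd]))  # prepend the cyclic prefix

# A cyclic shift in time is a per-subcarrier phase ramp in frequency:
print(np.allclose(np.fft.fft(np.roll(x, 2)),
                  X * np.exp(-2j * np.pi * 2 * np.arange(N) / N)))  # True
```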
Procedia PDF Downloads 502
7517 Approaches to Integrating Entrepreneurial Education in School Curriculum
Authors: Kofi Nkonkonya Mpuangnan, Samantha Govender, Hlengiwe Romualda Mhlongo
Abstract:
In recent years, a worrisome pattern has emerged in numerous developing nations: a steady and persistent rise in unemployment rates. This escalation of economic struggles has become a cause of great concern for parents who, having invested significant resources in their children's education, harboured hopes of achieving economic prosperity and stability for their families through secure employment. To effectively tackle this pressing unemployment issue, it is imperative to adopt a holistic approach, and a pivotal aspect of this approach involves incorporating entrepreneurial education seamlessly into the entire educational system. In this light, the authors explored approaches to integrating entrepreneurial education into the school curriculum, focusing on the following questions: How can an entrepreneurial mindset among learners be promoted in school? And how far have pedagogical approaches improved entrepreneurship in schools? To find answers to these questions, a systematic literature review underpinned by Human Capital Theory was adopted. This method was supported by three stages of guidelines: planning, conducting, and reporting. The data were sought from publishers with expansive coverage of scholarly literature, such as Sage, Taylor & Francis, Emerald, and Springer, covering publications from 1965 to 2023. The search was supported by two broad terms: promoting an entrepreneurial mindset in learners, and pedagogical strategies for enhancing entrepreneurship. It was found that an entrepreneurial mindset can be acquired through an innovative classroom environment, resilience building, and engagement with guest speakers and industry experts. Also, teachers can promote entrepreneurial education through the adoption of pedagogical approaches such as hands-on learning and experiential activities, role-playing, business simulation games, and creative and innovative teaching. It was recommended that the Ministry of Education should develop tailored training programs and workshops aimed at empowering educators with the essential competencies and insights to deliver impactful entrepreneurial education.
Keywords: education, entrepreneurship, school curriculum, pedagogical approaches, integration
Procedia PDF Downloads 97
7516 A Spiral Dynamic Optimised Hybrid Fuzzy Logic Controller for a Unicycle Mobile Robot on Irregular Terrains
Authors: Abdullah M. Almeshal, Mohammad R. Alenezi, Talal H. Alzanki
Abstract:
This paper presents a hybrid fuzzy logic control strategy for a unicycle trajectory-following robot on irregular terrains. In the literature, researchers have presented the design of path tracking controllers for mobile robots on non-frictional surfaces. In this work, the robot is simulated driving on irregular terrains with contrasting frictional profiles of peat and rough gravel. A hybrid fuzzy logic controller is utilised to stabilise and drive the robot precisely along the predefined trajectory and overcome the frictional impact. The controller gains and scaling factors were optimised using the spiral dynamics optimisation algorithm to minimise the mean square error of the linear and angular velocities of the unicycle robot. The robot was simulated on various frictional surfaces and terrains, and the controller was able to stabilise the robot with a superior performance, as shown via simulation results.
Keywords: fuzzy logic control, mobile robot, trajectory tracking, spiral dynamic algorithm
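For reference, the core update of the spiral dynamics algorithm (Tamura and Yasuda's formulation, sketched here in 2-D on a toy objective rather than the paper's controller-tuning cost) rotates and contracts every search point around the current best:

```python
import numpy as np

def sphere(p):                      # toy objective standing in for the MSE cost
    return np.sum(p**2)

r, theta = 0.95, np.pi / 4          # contraction rate and rotation angle
R = r * np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

rng = np.random.default_rng(3)
pts = rng.uniform(-5.0, 5.0, (20, 2))           # initial search points
best = pts[np.argmin([sphere(p) for p in pts])]

for _ in range(100):
    # Spiral update: x <- R x - (R - I) x*  (rotate/contract around the best)
    pts = (R @ pts.T).T - (R - np.eye(2)) @ best
    cand = pts[np.argmin([sphere(p) for p in pts])]
    if sphere(cand) < sphere(best):
        best = cand

print("best point:", best, "objective:", sphere(best))
```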
Procedia PDF Downloads 495
7515 Convex Restrictions for Outage Constrained MU-MISO Downlink under Imperfect Channel State Information
Authors: A. Preetha Priyadharshini, S. B. M. Priya
Abstract:
In this paper, we consider the MU-MISO downlink scenario under imperfect channel state information (CSI). The main issue with imperfect CSI is keeping each user's rate-outage probability below a given threshold level. Such rate outage constraints present significant analytical challenges. Many probabilistic methods are used to solve the transmit optimization problem under imperfect CSI. Here, decomposition-based large deviation inequality and Bernstein-type inequality convex restriction methods are used to solve the optimization problem under imperfect CSI. These methods achieve improved output quality with lower complexity: they provide a safe, tractable approximation of the original rate outage constraints. Based on these method implementations, performance has been evaluated in terms of feasible rate and average transmission power. The simulation results show that both methods offer significantly improved outage quality and lower computational complexity.
Keywords: imperfect channel state information, outage probability, multiuser multi-input single-output, channel state information
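In the usual notation (beamformers w_k, CSI error e_k, target rate r_k, outage tolerance rho_k; these symbols are assumptions here, since the abstract does not fix notation), the rate outage constraint being restricted is the chance constraint

```latex
\Pr\Big\{ \log_2\!\big(1 + \mathrm{SINR}_k(\{\mathbf{w}_i\}_{i=1}^{K}, \mathbf{e}_k)\big) < r_k \Big\} \le \rho_k,
\qquad k = 1, \dots, K,
```

and the large-deviation and Bernstein-type inequalities each replace this intractable probability with a conservative convex (safe) upper bound, so that any solution of the restricted problem is guaranteed feasible for the original one.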
Procedia PDF Downloads 813
7514 High Level Synthesis of Canny Edge Detection Algorithm on Zynq Platform
Authors: Hanaa M. Abdelgawad, Mona Safar, Ayman M. Wahba
Abstract:
Real-time image and video processing is in demand in many computer vision applications, e.g. video surveillance, traffic management and medical imaging. The processing of those video applications requires high computational power. Therefore, the optimal solution is the collaboration of the CPU and hardware accelerators. In this paper, a Canny edge detection hardware accelerator is proposed. Canny edge detection is one of the common blocks in the pre-processing phase of the image and video processing pipeline. Our approach targets offloading the Canny edge detection algorithm from the processing system (PS) to the programmable logic (PL), taking advantage of the High Level Synthesis (HLS) tool flow to accelerate the implementation on the Zynq platform. The resulting implementation enables up to a 100x performance improvement through hardware acceleration. CPU utilization drops, and the frame rate reaches 60 fps for a 1080p full-HD input video stream.
Keywords: high level synthesis, canny edge detection, hardware accelerators, computer vision
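As a software reference for what the accelerator computes, the OpenCV equivalent of the offloaded block is essentially a one-liner; the thresholds and file names here are typical placeholder values, not the paper's settings:

```python
import cv2

frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)  # placeholder input frame
blurred = cv2.GaussianBlur(frame, (5, 5), 1.4)          # noise-reduction stage
edges = cv2.Canny(blurred, 100, 200)                    # low/high hysteresis thresholds
cv2.imwrite("edges.png", edges)
```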
Procedia PDF Downloads 478
7513 An Amended Method for Assessment of Hypertrophic Scars Viscoelastic Parameters
Authors: Iveta Bryjova
Abstract:
Recording of viscoelastic strain-vs-time curves with the aid of the suction method, and a follow-up analysis resulting in evaluation of standard viscoelastic parameters, is a significant technique for non-invasive contact diagnostics of the mechanical properties of skin and assessment of its condition, particularly in acute burns, hypertrophic scarring (the most common complication of burn trauma) and reconstructive surgery. For elimination of the skin thickness contribution, usable viscoelastic parameters deduced from the strain-vs-time curves are restricted to the relative ones (i.e. those expressed as a ratio of two dimensional parameters), like gross elasticity, net elasticity, biological elasticity or Qu's area parameters, in the literature and practice conventionally referred to as R2, R5, R6, R7, Q1, Q2, and Q3. With the exception of parameters R2 and Q1, the remaining ones substantially depend on the position of the inflection point separating the elastic linear and viscoelastic segments of the strain-vs-time curve. The standard algorithm implemented in commercially available devices relies heavily on the experimental fact that the inflection comes about 0.1 sec after the suction switch-on/off, which depreciates the credibility of parameters thus obtained. Although Qu's US 7,556,605 patent suggests a method of improving the precision of the inflection determination, there is still room for non-negligible improvement. In this contribution, a novel method of inflection point determination utilizing the advantageous properties of Savitzky–Golay filtering is presented. The method allows computation of derivatives of the smoothed strain-vs-time curve, more exact location of the inflection point and, consequently, more reliable values of the aforementioned viscoelastic parameters. The improved applicability of the five inflection-dependent relative viscoelastic parameters is demonstrated by recasting a former study under the new method, and by comparing its results with those provided by the methods that have been used so far.
Keywords: Savitzky–Golay filter, scarring, skin, viscoelasticity
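The idea can be sketched with scipy: smooth the curve, take the Savitzky-Golay second derivative, and locate its zero crossing. The synthetic S-shaped curve and filter settings below are illustrative, not the contribution's actual data or tuning:

```python
import numpy as np
from scipy.signal import savgol_filter

dt = 0.002
t = np.arange(0.0, 0.5, dt)
# Synthetic S-shaped strain-vs-time curve with an inflection near t = 0.1 s,
# plus measurement noise.
rng = np.random.default_rng(0)
strain = 0.8 / (1.0 + np.exp(-(t - 0.1) / 0.03)) + 0.005 * rng.normal(size=t.size)

# Savitzky-Golay second derivative of the smoothed curve.
d2 = savgol_filter(strain, window_length=51, polyorder=3, deriv=2, delta=dt)

# Inflection estimate: first sign change of the second derivative.
sign_change = np.nonzero(np.diff(np.sign(d2)) != 0)[0]
print("estimated inflection time:",
      t[sign_change[0]] if sign_change.size else None)  # close to 0.1 s
```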
Procedia PDF Downloads 304
7512 Facility Anomaly Detection with Gaussian Mixture Model
Authors: Sunghoon Park, Hank Kim, Jinwon An, Sungzoon Cho
Abstract:
The Internet of Things allows one to collect data from facilities, which are then used to monitor them and even predict malfunctions in advance. Conventional quality control methods focus on setting a normal range on a sensor value, defined between a lower control limit and an upper control limit, and declaring as an anomaly anything falling outside it. However, interactions among sensor values are ignored, thus leading to suboptimal performance. We propose a multivariate approach which takes into account many sensor values at the same time. In particular, a Gaussian Mixture Model is used, which is trained to maximize the likelihood value using the Expectation-Maximization algorithm. The number of Gaussian component distributions is determined by the Bayesian Information Criterion. The negative log-likelihood value is used as an anomaly score. The actual usage scenario goes as follows: for each instance of sensor values from a facility, an anomaly score is computed; if it is larger than a threshold, an alarm goes off and a human expert intervenes and checks the system. Real-world data from a building energy system were used to test the model.
Keywords: facility anomaly detection, gaussian mixture model, anomaly score, expectation maximization algorithm
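The pipeline described maps almost directly onto scikit-learn; a minimal sketch with synthetic sensor vectors (the component-count range and the 99th-percentile threshold rule are illustrative choices, not the paper's):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
train = rng.normal(size=(1000, 4))            # normal-operation sensor vectors

# Model selection: pick the component count with the lowest BIC.
models = [GaussianMixture(n_components=k, random_state=0).fit(train)
          for k in range(1, 6)]
gmm = min(models, key=lambda m: m.bic(train))

# Anomaly score = negative log-likelihood; threshold from the training data.
scores_train = -gmm.score_samples(train)
threshold = np.quantile(scores_train, 0.99)

new_reading = np.array([[4.0, -3.5, 5.0, 0.1]])   # a suspicious instance
score = -gmm.score_samples(new_reading)[0]
print("alarm!" if score > threshold else "ok", f"(score={score:.1f})")
```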
Procedia PDF Downloads 272
7511 Towards an Enhanced Quality of IPTV Media Server Architecture over Software Defined Networking
Authors: Esmeralda Hysenbelliu
Abstract:
The aim of this paper is to present an enhanced QoE (Quality of Experience) IPTV SDN-based media streaming server architecture for configuring, controlling, managing and provisioning improved delivery of an IPTV service application with low cost, low bandwidth, and high security. Furthermore, a virtual QoE IPTV SDN-based topology is given to provide an improved IPTV service based on QoE control and management of multimedia service functionalities. Inside the OpenFlow SDN controller, service load-balancing systems are enabled with high flexibility and efficiency, based on the load-balance module and on the GeoIP service. These two load-balancing systems greatly improve the IPTV end-users' Quality of Experience (QoE) with optimal management of resources. Through the key functionalities of the OpenFlow SDN controller, this approach produced several important features and opportunities for overcoming the critical QoE metrics for the IPTV service, like achieving a fast zapping time (channel switching time) of < 0.1 seconds. This approach enabled an easy and powerful transcoding system via the FFmpeg encoder. It has the ability to customize streaming dimensions, bitrates, latency management and maximum transfer rates, ensuring delivery of IPTV streaming services (audio and video) with high flexibility, low bandwidth and the required performance. Unlike other architectures, this QoE IPTV SDN-based media streaming architecture provides the possibility of channel exchanging between several IPTV service providers all over the world. This new functionality brings many benefits, such as increasing the number of TV channels received by end-users at low cost, decreasing stream failure time (channel failure time < 0.1 seconds) and improving the quality of streaming services.
Keywords: improved quality of experience (QoE), OpenFlow SDN controller, IPTV service application, softwarization
Procedia PDF Downloads 147
7510 KCBA, A Method for Feature Extraction of Colonoscopy Images
Authors: Vahid Bayrami Rad
Abstract:
In recent years, the use of artificial intelligence techniques, tools, and methods in processing medical images and health-related applications has been highlighted, and a lot of research has been done in this regard. For example, colonoscopy and diagnosis of colon lesions are cases in which the process of diagnosing lesions can be improved by using image processing and artificial intelligence algorithms, which help doctors a lot. Due to the lack of accurate measurements and the variety of injuries in colonoscopy images, the process of diagnosing the type of lesion is somewhat difficult even for expert doctors. Therefore, by using different software and image processing, doctors can be helped to increase the accuracy of their observations and ultimately improve their diagnosis. Also, by using automatic methods, the process of diagnosing the type of disease can be improved. Therefore, in this paper, a deep learning framework called KCBA, composed of several methods such as K-means clustering, bag of features, and a deep auto-encoder, is proposed to classify colonoscopy lesions. Finally, according to the experimental results, the proposed method's performance in classifying colonoscopy images is reported with respect to the accuracy criterion.
Keywords: colorectal cancer, colonoscopy, region of interest, narrow band imaging, texture analysis, bag of feature
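A generic bag-of-visual-features step of the kind named in KCBA can be sketched as follows; ORB stands in here for whatever local descriptor the paper actually uses, and the vocabulary size and file paths are placeholders:

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans

def orb_descriptors(path):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, des = cv2.ORB_create().detectAndCompute(img, None)
    return des  # may be None if no keypoints are found

# Build the visual vocabulary from training images (paths are placeholders).
train_paths = ["frame_001.png", "frame_002.png", "frame_003.png"]
all_des = np.vstack([d for p in train_paths
                     if (d := orb_descriptors(p)) is not None])
vocab = KMeans(n_clusters=32, n_init=10, random_state=0).fit(all_des.astype(np.float32))

def bag_of_features(path):
    """Histogram of visual-word occurrences, used as the image's feature vector."""
    des = orb_descriptors(path)
    words = vocab.predict(des.astype(np.float32))
    hist, _ = np.histogram(words, bins=np.arange(33))
    return hist / hist.sum()

print(bag_of_features("frame_001.png"))
```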
Procedia PDF Downloads 57
7509 Mesoporous Na2Ti3O7 Nanotube-Constructed Materials with Hierarchical Architecture: Synthesis and Properties
Authors: Neumoin Anton Ivanovich, Opra Denis Pavlovich
Abstract:
Materials based on titanium oxide compounds are widely used in areas such as solar energy, photocatalysis, the food industry and hygiene products, biomedical technologies, etc. Demand for them has also formed in the battery industry (an example of this is the commercialization of Li4Ti5O12), where much attention has recently been paid to the development of next-generation systems and technologies, such as sodium-ion batteries. This dictates the need to search for new materials with improved characteristics, as well as for ways of obtaining them that meet the requirements of scalability. One way to solve these problems can be the creation of nanomaterials, which often have a complex of physicochemical properties that radically differ from the characteristics of their counterparts in the micro- or macroscopic state. At the same time, it is important to control the texture (specific surface area, porosity) of such materials. In view of the above, among other methods, the hydrothermal technique seems suitable, allowing a wide range of control over the conditions of synthesis. In the present study, a method was developed for the preparation of mesoporous nanostructured sodium trititanate (Na2Ti3O7) with a hierarchical architecture. The materials were synthesized by hydrothermal processing and exhibit a complex, hierarchically organized two-level architecture. At the first level of the hierarchy, the materials are represented by particles having a rough surface, and at the second level, by one-dimensional nanotubes. The products were found to have a high specific surface area and porosity, with a narrow pore size distribution (about 6 nm). As is known, specific surface area and porosity are important characteristics of functional materials, which largely determine the possibilities and directions of their practical application. Electrochemical impedance spectroscopy data show that the resulting sodium trititanate has a sufficiently high electrical conductivity. As expected, the synthesized complexly organized porous nanoarchitecture based on sodium trititanate may be of practical use, for example, in the field of new-generation electrochemical energy storage and conversion devices.
Keywords: sodium trititanate, hierarchical materials, mesoporosity, nanotubes, hydrothermal synthesis
Procedia PDF Downloads 107
7508 A Near-Optimal Domain Independent Approach for Detecting Approximate Duplicates
Authors: Abdelaziz Fellah, Allaoua Maamir
Abstract:
We propose a domain-independent merging-cluster filter approach, complemented with a set of algorithms, for identifying approximate duplicate entities efficiently and accurately within a single data source and across multiple data sources. The near-optimal merging-cluster filter (MCF) approach is based on the well-tuned Monge-Elkan algorithm and extended with an affine variant of the Smith-Waterman similarity measure. We then present constant, variable, and function threshold algorithms that work conceptually in a divide-merge filtering fashion for detecting near-duplicates as hierarchical clusters, along with their corresponding representatives. The algorithms take recursive refinement approaches in the spirit of filtering, merging, and updating cluster representatives to detect approximate duplicates at each level of the cluster tree. Experiments show the high effectiveness and accuracy of the MCF approach in detecting approximate duplicates, outperforming the seminal Monge-Elkan algorithm on several real-world benchmarks and generated datasets.
Keywords: data mining, data cleaning, approximate duplicates, near-duplicates detection, data mining applications and discovery
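The Monge-Elkan scheme at the heart of this approach averages, over the tokens of one string, the best match against the tokens of the other. A minimal sketch follows; for brevity, difflib's ratio stands in for the affine Smith-Waterman token similarity the paper uses:

```python
from difflib import SequenceMatcher

def token_sim(a, b):
    """Inner token similarity; difflib's ratio stands in here for the
    affine Smith-Waterman variant used in the paper."""
    return SequenceMatcher(None, a, b).ratio()

def monge_elkan(s, t):
    """Monge-Elkan similarity: average, over tokens of s, of the best
    match against the tokens of t."""
    s_tokens, t_tokens = s.lower().split(), t.lower().split()
    if not s_tokens or not t_tokens:
        return 0.0
    return sum(max(token_sim(a, b) for b in t_tokens)
               for a in s_tokens) / len(s_tokens)

# A symmetrized score is often used, since Monge-Elkan is asymmetric.
a, b = "Dept. of Computer Science", "Computer Science Department"
score = (monge_elkan(a, b) + monge_elkan(b, a)) / 2
print(f"{score:.2f}")  # a high score flags a candidate approximate duplicate
```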
Procedia PDF Downloads 387
7507 Functionalization of Polypropylene with Chiral Monomer for Improving Hemocompatibility
Authors: Xiaodong Xu, Dan Zhao, Xiujuan Chang, Chunming Li, Huiyun Zhou, Xin Li, Qiang Shi, Shifang Luan, Jinghua Yin
Abstract:
Polypropylene (PP) is one of the most commonly used plastics because of its low density, outstanding mechanical properties, and low cost. However, its drawbacks, such as low surface energy, poor dyeability, lack of chemical functionalities, and poor compatibility with polar polymers and inorganic materials, have restricted its applications. To expand its application in biomedical materials, functionalization is considered to be the most effective approach. In this study, PP was functionalized with a chiral monomer, (S)-1-acryloylpyrrolidine-2-carboxylic acid ((S)-APCA), by free-radical grafting in the solid phase. The grafting degree of PP-g-APCA was determined by a chemical titration method, and the chemical structure of the functionalized PP was characterized by FTIR spectroscopy, which confirmed that the chiral monomer (S)-APCA was successfully grafted onto PP. Static water contact angle results suggested that the surface hydrophilicity of PP was significantly improved by solid phase grafting assisted by surface water treatment. Protein adsorption and platelet adhesion results showed that the hemocompatibility of PP was greatly improved by grafting the chiral monomer.
Keywords: functionalization, polypropylene, chiral monomer, hemocompatibility
Procedia PDF Downloads 381
7506 Review of Published Articles on Climate Change and Health in Two Francophone Newspapers: 1990-2015
Authors: Mathieu Hemono, Sophie Puig-Malet, Patrick Zylberman, Avner Bar-Hen, Rainer Sauerborn, Stefanie Schütte, Niamh Herlihi, Antoine Flahault and Anneliese Depoux
Abstract:
Since the IPCC released its first report in 1990, an increasing number of peer-reviewed publications have reported the health risks associated with climate change. Although there is a large body of evidence supporting the association between climate change and poor health outcomes, the media is inconsistent in the attention it pays to the subject matter. This study aims to analyze the modalities and rhetoric in the media concerning the impact of climate change on health, in order to better understand its role in information dissemination. A review was conducted of articles published between 1990 and 2015 in the francophone newspapers Le Monde and Jeune Afrique. A detailed search strategy including specific climate and health terminology was used to search the newspapers' online databases. 1,202 articles were identified as having referenced the terms climate change and health. Inclusion and exclusion criteria were applied to narrow the search to articles referencing the effects of climate change on human health, and 160 articles were included in the final analysis. Data were extracted and categorized to create a structured database allowing for further investigation and analysis. The review indicated that although 66% of the selected newspaper articles reference scientific evidence of the impact of climate change on human health, the focus on the topic is limited to major political events or to circumstances relating to public health crises. Main findings also include that, among the many direct and indirect health outcomes, infectious diseases are the main health outcome highlighted in association with climate change. Lastly, the articles suggest that while developed countries have caused most of the greenhouse effect, the global south is more immediately affected. Overall, the reviewed articles reinforce the need for international cooperation in finding a solution to mitigate the effects of climate change on health. The manner in which scientific results are communicated and disseminated impacts individual and collective perceptions of the topic in the public sphere and affects political will to shape policy. The results of this analysis will underline the modalities of the rhetoric of transparency and provide the basis for a perception study of media discourses. This study is part of an interdisciplinary project called 4CHealth that confronts results of the research done on scientific, political and press literature to better understand how knowledge on climate change and health circulates within those different fields and whether and how it is translated into real-world change.
Keywords: climate change, health, health impacts, communication, media, rhetoric, awareness, Global South, Africa
Procedia PDF Downloads 423
7505 Performances of Type-2 Fuzzy Logic Control and Neuro-Fuzzy Control Based on DPC for Grid Connected DFIG with Fixed Switching Frequency
Authors: Fayssal Amrane, Azeddine Chaiba
Abstract:
In this paper, type-2 fuzzy logic control (T2FLC) and neuro-fuzzy control (NFC) for a doubly fed induction generator (DFIG), based on direct power control (DPC) with a fixed switching frequency, are proposed for wind generation applications. First, a mathematical model of the doubly fed induction generator implemented in the d-q reference frame is derived. Then, a DPC algorithm approach for controlling the active and reactive power of the DFIG via a fixed switching frequency is incorporated using PID. The performances of the T2FLC and NFC, which are based on the DPC algorithm, are investigated and compared to those obtained from the PID controller. Finally, simulation results demonstrate that the NFC is more robust and offers superior dynamic performance for wind power generation system applications.
Keywords: doubly fed induction generator (DFIG), direct power control (DPC), neuro-fuzzy control (NFC), maximum power point tracking (MPPT), space vector modulation (SVM), type 2 fuzzy logic control (T2FLC)
Procedia PDF Downloads 419