Search results for: key frame extraction
828 Adaptive Process Monitoring for Time-Varying Situations Using Statistical Learning Algorithms
Authors: Seulki Lee, Seoung Bum Kim
Abstract:
Statistical process control (SPC) is a practical and effective method for quality control. The most important and widely used technique in SPC is the control chart. The main goal of a control chart is to detect any assignable changes that affect the quality output. Most conventional control charts, such as Hotelling’s T2 chart, are based on the assumption that the quality characteristics follow a multivariate normal distribution. However, in modern, complicated manufacturing systems, control chart techniques that can efficiently handle nonnormal processes are required. To overcome the shortcomings of conventional control charts for nonnormal processes, several methods have been proposed that combine statistical learning algorithms with multivariate control charts. Statistical learning-based control charts, such as support vector data description (SVDD)-based charts and k-nearest neighbor-based charts, have proven their improved performance in nonnormal situations compared to that of the T2 chart. Besides nonnormality, time-varying operations are also quite common in real manufacturing fields because of various factors such as product and set-point changes, seasonal variations, catalyst degradation, and sensor drifting. However, traditional control charts cannot accommodate future condition changes of the process because they are formulated from data recorded in the early stage of the process. In the present paper, we propose an SVDD-based control chart that is capable of adaptively monitoring time-varying and nonnormal processes. We reformulated the SVDD algorithm into a time-adaptive SVDD algorithm by adding a weighting factor that reflects time-varying situations. Moreover, we defined an updating region for an efficient model-updating structure of the control chart. The proposed control chart simultaneously allows efficient model updates and timely detection of out-of-control signals.
The effectiveness and applicability of the proposed chart were demonstrated through experiments with simulated data and real data from the metal frame process in mobile device manufacturing.
Keywords: multivariate control chart, nonparametric method, support vector data description, time-varying process
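The time-adaptive weighting idea can be illustrated with a minimal sketch. This is not the authors' SVDD formulation; it substitutes a simpler centroid-distance monitor in which a hypothetical forgetting factor `lam` down-weights older observations, and it omits the paper's updating-region restriction.

```python
# Minimal sketch of a time-adaptive, distance-based process monitor.
# Illustrative stand-in for the paper's weighted SVDD, not its actual
# formulation: `lam` (forgetting factor) and `threshold` are
# hypothetical parameters, and the updating region is omitted.

def monitor(stream, lam=0.9, threshold=2.0):
    """Flag observations whose distance from an exponentially
    weighted centroid exceeds `threshold`; the centroid adapts so
    that recent data dominate (tracking time-varying behaviour)."""
    centroid, weight = None, 0.0
    signals = []
    for x in stream:
        if centroid is None:
            centroid, weight = list(x), 1.0
            continue
        dist = sum((a - c) ** 2 for a, c in zip(x, centroid)) ** 0.5
        signals.append(dist > threshold)  # out-of-control signal?
        weight = lam * weight + 1.0       # old data decay by lam
        centroid = [c + (a - c) / weight for c, a in zip(centroid, x)]
    return signals
```

With a stream that drifts slowly, the centroid follows the drift; a sudden jump still exceeds the threshold and is flagged.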
Procedia PDF Downloads 299
827 Localization of Geospatial Events and Hoax Prediction in the UFO Database
Authors: Harish Krishnamurthy, Anna Lafontant, Ren Yi
Abstract:
Unidentified Flying Objects (UFOs) have been an interesting topic for enthusiasts, and people all over the United States report such sightings online at the National UFO Reporting Center (NUFORC). Some of these reports are hoaxes. Among those that seem legitimate, our task is not to establish that these events are indeed related to flying objects from aliens in outer space; rather, we intend to identify whether a report was a hoax, as identified by the UFO database team with their existing curation criteria. The database provides a wealth of information that can be exploited for various analyses and insights, such as social reporting, identifying real-time spatial events, and much more. We perform analyses to localize these time-series geospatial events and correlate them with known real-time events. This paper does not confirm any legitimacy of alien activity, but rather attempts to gather information from likely legitimate reports of UFOs by studying the online reports. These events happen in geospatial clusters and are also time-based. We look at cluster density and data visualization to search the space of various cluster realizations and decide on the most probable clusters, which provide us information about the proximity of such activity. A random forest classifier is also presented, used to identify true events and hoax events based on the best available features, such as region, week, time-period, and duration. Lastly, we show the performance of the scheme on various days and correlate it with real-time events; one of the UFO reports strongly correlates with a missile test conducted in the United States.
Keywords: time-series clustering, feature extraction, hoax prediction, geospatial events
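The feature set named for the classifier (region, week, time-period, duration) could be derived along these lines; the field names and the time-period bucketing below are hypothetical, not NUFORC's actual schema.

```python
# Sketch of feature extraction for the hoax classifier. The input
# field names (`timestamp`, `region`, `duration_s`) and the
# time-period buckets are assumptions for illustration.
from datetime import datetime

def extract_features(report):
    """Turn a raw report into (region, ISO week, time-period, duration)."""
    ts = datetime.fromisoformat(report["timestamp"])
    hour = ts.hour
    if 6 <= hour < 12:
        period = "morning"
    elif 12 <= hour < 18:
        period = "afternoon"
    elif 18 <= hour < 22:
        period = "evening"
    else:
        period = "night"
    return {
        "region": report["region"],
        "week": ts.isocalendar()[1],  # ISO week number of the year
        "period": period,
        "duration_s": report["duration_s"],
    }
```

Feature dictionaries of this shape could then be one-hot encoded and fed to a random forest.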
Procedia PDF Downloads 377
826 A Comprehensive Review of Artificial Intelligence Applications in Sustainable Building
Authors: Yazan Al-Kofahi, Jamal Alqawasmi
Abstract:
In this study, a systematic literature review (SLR) was conducted with the main goal of assessing the existing literature on how artificial intelligence (AI), machine learning (ML), and deep learning (DL) models are used in sustainable architecture applications and issues, including thermal comfort satisfaction, energy efficiency, cost prediction, and many other issues. The search strategy used several databases, including Scopus, Springer, and Google Scholar. The inclusion criteria were applied using two search strings related to DL, ML, and sustainable architecture. The timeframe for the inclusion of papers was open, although most of the included papers were from the previous four years. As a filtration strategy, conference papers and books were excluded from the database search results. Using these inclusion and exclusion criteria, the search was conducted, and a sample of 59 papers was selected for the final analysis. In the data extraction phase, the needed data were extracted from these papers, then analyzed and correlated. The results of this SLR showed that there are many applications of ML and DL in sustainable buildings and that this topic is currently trendy. Most of the papers focused on addressing environmental sustainability issues and factors using machine learning predictive models, with a particular emphasis on Decision Tree algorithms. Moreover, the Random Forest regressor was found to demonstrate strong performance across all feature selection groups for building cost prediction as a machine-learning predictive model.
Keywords: machine learning, deep learning, artificial intelligence, sustainable building
Procedia PDF Downloads 67
825 The Performance Improvement of Solar Aided Power Generation System by Introducing the Second Solar Field
Authors: Junjie Wu, Hongjuan Hou, Eric Hu, Yongping Yang
Abstract:
Solar aided power generation (SAPG) technology has been proven to be an efficient way to use solar energy for power generation. In an SAPG plant, a solar field consisting of parabolic solar collectors is normally used to supply solar heat in order to displace the high pressure/temperature extraction steam. To understand the performance of such a SAPG plant, a new simulation model was recently developed by the authors, in which the boiler was treated as a series of heat exchangers, unlike in previous models. Simulations using the new model showed that the outlet properties of the reheated steam, e.g. temperature, would decrease due to the introduction of solar heat. These changes make the (lower stage) turbines work under off-design conditions, so the whole plant’s performance may not be optimal. In this paper, a second solar field is proposed to increase the inlet temperature of the steam to be reheated, in order to bring the outlet temperature of the reheated steam back to the design condition. A 600 MW SAPG plant was simulated as a case study using the new model to understand the impact of the second solar field on plant performance. The study found that the second solar field would improve the plant’s performance in terms of cycle efficiency and solar-to-electricity efficiency by 1.91% and 6.01%, respectively. The solar-generated electricity produced per aperture area under the design condition was 187.96 W/m2, which was 26.14% higher than the previous design.
Keywords: solar-aided power generation system, off-design performance, coal-saving performance, boiler modelling, integration schemes
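As a quick sanity check on the reported figures, assuming "26.14% higher" is relative to the previous design, the implied per-aperture output of the previous design can be recovered:

```python
# Back-check of the reported per-aperture output, assuming the
# "26.14% higher" is relative to the previous design value.
new_output = 187.96          # W/m2, with the second solar field
improvement = 0.2614         # 26.14% higher than the previous design
previous = new_output / (1 + improvement)
print(round(previous, 1))    # implied previous design output, W/m2 -> 149.0
```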
Procedia PDF Downloads 290
824 Unsupervised Segmentation Technique for Acute Leukemia Cells Using Clustering Algorithms
Authors: N. H. Harun, A. S. Abdul Nasir, M. Y. Mashor, R. Hassan
Abstract:
Leukaemia is a blood cancer that contributes to the increase in the mortality rate in Malaysia each year. There are two main categories of leukaemia: acute and chronic. The production and development of acute leukaemia cells occur rapidly and uncontrollably; therefore, if acute leukaemia cells could be identified quickly and effectively, proper treatment and medicine could be delivered. Due to the requirement of prompt and accurate diagnosis of leukaemia, the current study proposes unsupervised pixel segmentation based on clustering algorithms in order to obtain a fully segmented abnormal white blood cell (blast) in acute leukaemia images. To obtain the segmented blast, three clustering algorithms, namely k-means, fuzzy c-means, and moving k-means, were applied to the saturation component image. Then, a median filter and a seeded region growing area extraction algorithm were applied to smooth the region of the segmented blast and to remove large unwanted regions from the image, respectively. Comparisons among the three clustering algorithms were made to measure the performance of each algorithm on segmenting the blast area. Based on the good sensitivity values obtained, the results indicate that the moving k-means clustering algorithm successfully produced a fully segmented blast region in acute leukaemia images. Hence, the resultant images could be helpful to haematologists for further analysis of acute leukaemia.
Keywords: acute leukaemia images, clustering algorithms, image segmentation, moving k-means
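A minimal one-dimensional k-means over saturation values sketches the clustering step; this is a simplified illustration, not the paper's moving k-means variant, and the initialization shown is an assumption.

```python
# Minimal 1-D k-means sketch of the clustering step applied to a list
# of saturation values; a simplified illustration, not the paper's
# full moving k-means pipeline.
def kmeans_1d(values, k=2, iters=20):
    """Cluster scalar pixel values into k groups; return sorted centroids."""
    # naive spread-out initialization over the sorted values (assumption)
    centroids = sorted(values)[:: max(1, len(values) // k)][:k]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda i: abs(v - centroids[i]))
            groups[idx].append(v)
        # recompute each centroid as the mean of its assigned values
        centroids = [sum(g) / len(g) if g else centroids[i]
                     for i, g in enumerate(groups)]
    return sorted(centroids)
```

Pixels would then be labelled "blast" or "background" by which centroid their saturation value is nearer to.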
Procedia PDF Downloads 291
823 Artificial Intelligence Based Abnormality Detection System and Real Valuᵀᴹ Product Design
Authors: Junbeom Lee, Jaehyuck Cho, Wookyeong Jeong, Jonghan Won, Jungmin Hwang, Youngseok Song, Taikyeong Jeong
Abstract:
This paper investigates and analyzes meta-learning technologies that use multiple cameras to monitor and detect abnormal behavior in people in real time in healthcare settings. Advances in artificial intelligence and computer vision have confirmed that cameras can be useful for individual health monitoring and abnormal behavior detection. Through this, it is possible to establish a system that can respond early by automatically detecting abnormal behavior in vulnerable individuals, such as patients and the elderly. In this paper, we use a technique called meta-learning to analyze image data collected from cameras and develop a commercial product to determine abnormal behavior. Meta-learning applies machine learning algorithms to help systems learn and adapt quickly to new real data, improving the accuracy and reliability of the abnormal behavior discrimination system. In addition, this study proposes a meta-learning-based abnormal behavior detection system that includes steps such as data collection and preprocessing, feature extraction and selection, and classification model development. Experiments across various healthcare scenarios analyze the performance of the proposed system and demonstrate its advantages over existing methods. Through this study, we present the possibility that camera-based meta-learning technology can be useful for monitoring and detecting abnormal behavior in the healthcare area.
Keywords: artificial intelligence, abnormal behavior, early detection, health monitoring
Procedia PDF Downloads 87
822 A Neurofeedback Learning Model Using Time-Frequency Analysis for Volleyball Performance Enhancement
Authors: Hamed Yousefi, Farnaz Mohammadi, Niloufar Mirian, Navid Amini
Abstract:
Investigating possible capacities of visual functions where adapted mechanisms can enhance the capability of sports trainees is a promising area of research, not only from the cognitive viewpoint but also in terms of unlimited applications in sports training. In this paper, the visual evoked potential (VEP) and event-related potential (ERP) signals of amateur and trained volleyball players were processed in a pilot study. Two groups of amateur and trained subjects were asked to imagine themselves in the state of receiving a ball while shown a simulated volleyball field. The proposed method is based on a set of time-frequency features, extracted from the VEP signals using algorithms such as the Gabor filter, the continuous wavelet transform, and a multi-stage wavelet decomposition, that can be indicative of whether a subject is amateur or trained. A linear discriminant classifier achieves accuracy, sensitivity, and specificity of 100% when the average of the repetitions of the signal corresponding to the task is used. The main purpose of this study is to investigate the feasibility of a fast, robust, and reliable feature/model determination as a neurofeedback parameter to be utilized for improving volleyball players’ performance. The proposed measure has potential applications in brain-computer interface technology, where a real-time biomarker is needed.
Keywords: visual evoked potential, time-frequency feature extraction, short-time Fourier transform, event-related spectrum potential classification, linear discriminant analysis
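A generic short-time Fourier feature of the kind described (mean spectral magnitude in a frequency band, per window) can be sketched as follows; the window length and band limits are hypothetical, and this is not the paper's exact Gabor/wavelet pipeline.

```python
# Generic short-time Fourier feature sketch: mean DFT-bin magnitude in
# a band, computed per window. Window size (`win`) and band (`lo`,
# `hi`) are hypothetical parameters, not the paper's settings.
import cmath

def dft_mag(frame):
    """Magnitudes of the first n//2 DFT bins, normalized by n."""
    n = len(frame)
    return [abs(sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t, x in enumerate(frame))) / n
            for k in range(n // 2)]

def band_energy_features(signal, win=8, lo=1, hi=4):
    """Mean magnitude of DFT bins [lo, hi) for consecutive windows."""
    feats = []
    for start in range(0, len(signal) - win + 1, win):
        mags = dft_mag(signal[start:start + win])
        feats.append(sum(mags[lo:hi]) / (hi - lo))
    return feats
```

Per-window features of this shape could then be fed to a linear discriminant classifier.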
Procedia PDF Downloads 138
821 Emulsified Oil Removal in Produced Water by Graphite-Based Adsorbents Using Adsorption Coupled with Electrochemical Regeneration
Authors: Zohreh Fallah, Edward P. L. Roberts
Abstract:
One of the big challenges in produced water treatment is removing oil in the form of emulsified droplets, which are not easily separated. An attractive approach is adsorption, as it is a simple and effective process. However, adsorbents must be regenerated in order to make the process cost-effective. Several sorbents have been tested for treating oily wastewater, but issues such as the high energy consumption of thermal regeneration of activated carbon have been reported. Due to their significant electrical conductivity, graphite intercalation compounds (GICs) were found to be suitable for electrochemical regeneration. They are non-porous materials with low surface area and fast adsorptive capacity, useful for removal of low concentrations of organics. An innovative adsorption/regeneration process has been developed at the University of Manchester, in which adsorption of organics is performed by a patented GIC adsorbent, coupled with subsequent electrochemical regeneration. The oxidation of adsorbed organics enables 100% regeneration, so that the adsorbent can be reused over multiple adsorption cycles. GIC adsorbents are capable of removing a wide range of organics and pollutants; however, no comparable report is available for removal of emulsified oil from produced water using the abovementioned process. In this study, the performance of this technology for the removal of emulsified oil from wastewater was evaluated. Batch experiments were carried out to determine the adsorption kinetics and equilibrium isotherm for both real produced water and model emulsions. The amount of oil in the wastewater was measured by toluene extraction/fluorescence analysis before and after the adsorption and electrochemical regeneration cycles. It was found that oil-in-water emulsions could be successfully treated by this process, with more than 70% of the oil removed.
Keywords: adsorption, electrochemical regeneration, emulsified oil, produced water
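The ">70% removed" figure is a simple removal-efficiency ratio; a sketch with hypothetical concentrations:

```python
# Removal-efficiency calculation of the kind behind the ">70% oil
# removed" figure; the example concentrations are hypothetical.
def removal_efficiency(c_initial, c_final):
    """Percent of emulsified oil removed from the water phase."""
    return 100.0 * (c_initial - c_final) / c_initial

print(removal_efficiency(200.0, 55.0))  # e.g. 200 -> 55 mg/L: prints 72.5
```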
Procedia PDF Downloads 582
820 Environmental Potential of Biochar from Wood Biomass Thermochemical Conversion
Authors: Cora Bulmău
Abstract:
Soil polluted with hydrocarbon spills is a major global concern today. In response to this issue, our experimental study compares an environmentally friendly method, the use of biochar, with a classical procedure, incineration of contaminated soil. Biochar is the solid product of biomass pyrolysis and can additionally be used as an additive intended to improve soil quality. The positive effect of biochar addition to soil stems from its capacity to adsorb and contain petroleum products within its pores. Given the capacity of biochar to interact with organic contaminants, the purpose of the present study was to experimentally establish the effects of adding wood-biomass-derived biochar to a soil contaminated with oil. The contaminated soil was amended with biochar (10%) produced by pyrolysis under different operational conditions of the thermochemical process. After 25 days, the concentration of petroleum hydrocarbons in the soil treated with biochar was measured. Soxhlet extraction was adopted as the analytical method to estimate the concentrations of total petroleum hydrocarbons (TPH) in the soil samples. This technique was applied to the contaminated soil as well as to soils remediated by incineration or biochar addition. Treating the soil with biochar obtained from pyrolysis of birch wood led to a considerable decrease in the concentrations of petroleum products. The incineration treatments conducted to clean up the same oil-contaminated soil used temperatures of about 600°C, 800°C and 1000°C and treatment times of 30 and 60 minutes. The experimental results revealed that the biochar method achieved efficiencies comparable to those of the incineration processes applied for the shortest time.
Keywords: biochar, biomass, remediation, soil, TPH
Procedia PDF Downloads 236
819 Presence and Absence: The Use of Photographs in Paris, Texas
Authors: Yi-Ting Wang, Wen-Shu Lai
Abstract:
The subject of this paper is the photography in the 1984 film Paris, Texas, directed by Wim Wenders. Wenders is well known as a film director as well as a photographer, and photographs appear as an element in many of his films. Some of these photographs serve as details within the films, while others play important roles in the story. This paper considers photographs in film as a specific type of text, the output of both still photography and the film itself. In Paris, Texas, three sets of important photographs appear whose symbolic meanings are as dialectical as their text types. The relationship between these photos and the storyline is both dependent and isolated. The film’s images fly by and progress into other images, while the photos in the film serve a unique narrative function: by stopping the continuously flowing images, they provide the viewer a space for imagination and contemplation. They are more than just artistic forms; they also contain multiple meanings. The photographs in Paris, Texas play the role of both presence and absence according to their shifting meanings. There are references to their presence: photographs exist between film time and narrative time, so in terms of the interaction between the characters in the film, photographs are a common symbol of the beginning and end of the characters’ journeys. In terms of the audience, the film’s photographs are a link in the viewing frame structure, through which the creative motivation of the director can be explored. Photographs also point to the absence of certain objects: the scenes in the photos represent an imaginary map of emotion. The town of Paris, Texas is therefore isolated from the physical presence of the photograph, and is far more abstract than the reality in the film.
This paper embraces the ambiguous nature of photography and demonstrates its presence and absence in film with regard to the meaning of text. It is worth reflecting, however, that the interpretation of the film’s photographs is more provisional than that of any other type of photographic text: the characteristics of the text cause the interpretation to change along with variations in the interpretive process, making their meaning a dynamic process. The presence or absence of the photographs in the context of Paris, Texas also demonstrates the presence and absence of the creator, time, truth, and imagination. The film becomes more complete as a result of the revelation of the photographs, while the intertextual connection between the two forms simultaneously provides multiple possibilities for interpreting the photographs in the film.
Keywords: film, Paris, Texas, photography, Wim Wenders
Procedia PDF Downloads 319
818 Effect of Three Drying Methods on Antioxidant Efficiency and Vitamin C Content of Moringa oleifera Leaf Extract
Authors: Kenia Martínez, Geniel Talavera, Juan Alonso
Abstract:
Moringa oleifera is a plant containing many nutrients that are mostly concentrated within the leaves. Commonly, the separation process of these nutrients involves solid-liquid extraction followed by evaporation and drying to obtain a concentrated extract, which is rich in proteins, vitamins, carbohydrates, and other essential nutrients that can be used in the food industry. In this work, three drying methods were used, which involved very different temperature and pressure conditions, to evaluate the effect of each method on the vitamin C content and the antioxidant efficiency of the extracts. Solid-liquid extractions of Moringa leaf (LE) were carried out by employing an ethanol solution (35% v/v) at 50 °C for 2 hours. The resulting extracts were then dried i) in a convective oven (CO) at 100 °C and at an atmospheric pressure of 750 mbar for 8 hours, ii) in a vacuum evaporator (VE) at 50 °C and at 300 mbar for 2 hours, and iii) in a freeze-drier (FD) at -40 °C and at 0.050 mbar for 36 hours. The antioxidant capacity (EC50, mg solids/g DPPH) of the dry solids was calculated by the free radical inhibition method employing DPPH˙ at 517 nm, resulting in a value of 2902.5 ± 14.8 for LE, 3433.1 ± 85.2 for FD, 3980.1 ± 37.2 for VE, and 8123.5 ± 263.3 for CO. The calculated antioxidant efficiency (AE, g DPPH/(mg solids·min)) was 2.920 × 10-5 for LE, 2.884 × 10-5 for FD, 2.512 × 10-5 for VE, and 1.009 × 10-5 for CO. Further, the content of vitamin C (mg/L) determined by HPLC was 59.0 ± 0.3 for LE, 49.7 ± 0.6 for FD, 45.0 ± 0.4 for VE, and 23.6 ± 0.7 for CO. The results indicate that the convective drying preserves vitamin C and antioxidant efficiency to 40% and 34% of the initial value, respectively, while vacuum drying to 76% and 86%, and freeze-drying to 84% and 98%, respectively.
Keywords: antioxidant efficiency, convective drying, freeze-drying, Moringa oleifera, vacuum drying, vitamin C content
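The retention percentages quoted for vitamin C follow directly from the measured concentrations; a quick back-check:

```python
# Back-check of the reported vitamin C retention: each dried extract's
# value as a percentage of the initial leaf-extract (LE) value.
def retention(dried, initial):
    return round(100.0 * dried / initial)

# Vitamin C (mg/L): LE = 59.0; CO = 23.6, VE = 45.0, FD = 49.7
print(retention(23.6, 59.0))  # convective oven   -> 40
print(retention(45.0, 59.0))  # vacuum evaporator -> 76
print(retention(49.7, 59.0))  # freeze-drier      -> 84
```

These reproduce the 40%, 76%, and 84% figures in the abstract.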
Procedia PDF Downloads 269
817 Internet of Things Networks: Denial of Service Detection in Constrained Application Protocol Using Machine Learning Algorithm
Authors: Adamu Abdullahi, On Francisca, Saidu Isah Rambo, G. N. Obunadike, D. T. Chinyio
Abstract:
The paper discusses the potential threat of Denial of Service (DoS) attacks on the Constrained Application Protocol (CoAP) in Internet of Things (IoT) networks. As billions of IoT devices are expected to be connected to the internet in the coming years, these devices are vulnerable to attacks that disrupt their functioning. This research aims to tackle the issue by applying mixed qualitative and quantitative methods for feature selection and extraction, together with clustering algorithms, to detect DoS attacks on CoAP using machine learning algorithms (MLAs). The main objective of the research is to enhance the security scheme for CoAP in the IoT environment by analyzing the nature of DoS attacks and identifying a new set of features for detecting them in the IoT network environment. The aim is to demonstrate the effectiveness of the MLA in detecting DoS attacks and compare it with conventional intrusion detection systems for securing CoAP in the IoT environment. Findings: The research identifies the appropriate node for detecting DoS attacks in the IoT network environment and demonstrates how to detect the attacks through the MLA. The detection accuracy in both the classification and network simulation environments shows that the k-means algorithm scored the highest percentage in both training and testing of the evaluation. The network simulation platform also achieved the highest overall accuracy of 99.93%. The work also reviews conventional intrusion detection systems for securing CoAP in the IoT environment and discusses the DoS security issues associated with CoAP.
Keywords: algorithm, CoAP, DoS, IoT, machine learning
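The cluster-based detection step can be sketched as nearest-centroid assignment of traffic feature vectors; the centroids and features below (e.g. packet rate, mean payload size) are hypothetical stand-ins for those learned by k-means.

```python
# Sketch of cluster-based DoS detection: assign each traffic sample to
# the nearer of two centroids ("normal" vs "dos"). Centroid positions
# and the two features are hypothetical examples, not the paper's.
def nearer_centroid(x, c_normal, c_dos):
    d = lambda a, b: sum((i - j) ** 2 for i, j in zip(a, b))
    return "dos" if d(x, c_dos) < d(x, c_normal) else "normal"

def accuracy(samples, labels, c_normal, c_dos):
    """Fraction of samples assigned to the cluster matching their label."""
    hits = sum(nearer_centroid(x, c_normal, c_dos) == y
               for x, y in zip(samples, labels))
    return hits / len(labels)
```

In practice the centroids would come from running k-means on labelled CoAP traffic features.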
Procedia PDF Downloads 80
816 Molecular Characterisation and Expression of Glutathione S-Transferase of Fasciola Gigantica
Authors: J. Adeppa, S. Samanta, O. K. Raina
Abstract:
Fasciolosis is a widespread, economically important parasitic infection throughout the world, caused by Fasciola hepatica and F. gigantica. In order to identify novel immunogens conferring significant protection against fasciolosis, research has currently focused on defined antigens, viz. glutathione S-transferase, fatty acid binding protein, cathepsin-L, fluke hemoglobin, paramyosin, myosin and the F. hepatica Kunitz-type molecule. Among these antigens, GST, which plays a crucial role in detoxification processes (i.e., the phase II defense mechanism of this parasite), has a unique position as a novel vaccine candidate and a drug target in the control of this disease. For producing antigens in large quantities and purifying them to complete homogeneity, recombinant DNA technology has become an important tool. RT-PCR was carried out using F. gigantica total RNA as template, and an amplicon of the 657 bp GST gene was obtained. A TA cloning vector was used for cloning of this gene, and the presence of the insert was confirmed by blue-white selection of recombinant colonies. Sequence analysis of the present isolate showed 99.1% sequence homology with the published sequence of the F. gigantica GST gene of cattle origin (accession no. AF112657), with six nucleotide changes at positions 72, 74, 423, 513, 549 and 627, causing an overall change of 4 amino acids. The 657 bp GST gene was cloned at the BamHI and HindIII restriction sites of the prokaryotic expression vector pPROEXHTb, in frame with six histidine residues, and expressed in E. coli DH5α. The recombinant protein was purified from the bacterial lysate under non-denaturing conditions by sonication after lysozyme treatment and subjecting the soluble fraction of the lysate to Ni-NTA affinity chromatography. Western blotting with rabbit hyperimmune serum showed immunoreactivity with the 25 kDa recombinant GST. The recombinant protein detected F. gigantica experimental as well as field infection in buffaloes by dot-ELISA. However, cross-reactivity studies on the Fasciola gigantica GST antigen are needed to evaluate the utility of this protein in the serodiagnosis of fasciolosis.
Keywords: Fasciola gigantica, Fasciola hepatica, GST, RT-PCR
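The reported 99.1% homology is consistent with 6 nucleotide changes over the 657 bp gene:

```python
# Back-check of the reported homology: 6 nucleotide changes over the
# 657 bp GST gene give (657 - 6) / 657 identity.
length, changes = 657, 6
homology = 100.0 * (length - changes) / length
print(round(homology, 1))  # -> 99.1, matching the reported 99.1%
```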
Procedia PDF Downloads 186
815 Management of Acute Biliary Pathology at Gozo General Hospital
Authors: Kristian Bugeja, Upeshala A. Jayawardena, Clarissa Fenech, Mark Zammit Vincenti
Abstract:
Introduction: Biliary colic, acute cholecystitis, and gallstone pancreatitis are some of the most common surgical presentations at Gozo General Hospital (GGH). National Institute for Health and Care Excellence (NICE) guidelines advise that suitable patients with acute biliary problems should be offered a laparoscopic cholecystectomy within one week of diagnosis. There has traditionally been difficulty in achieving this, mainly due to the reluctance of some surgeons to operate in the acute setting, limited timely access to MRCP and ERCP, and organizational issues. Methodology: A retrospective study was performed involving all biliary pathology-related admissions to GGH during the two-year period of 2019 and 2020. Patients’ files and the electronic case summary (ECS) were used for data collection, which included demographic data, primary diagnosis, co-morbidities, management, waiting time to surgery, length of stay, readmissions, and reasons for readmission. NICE clinical guideline 188 (Gallstone disease) was used as the standard. Results: 51 patients were included in the study. The mean age was 58 years, and 35 (68.6%) were female. The main diagnoses on admission were biliary colic in 31 (60.8%) and acute cholecystitis in 10 (19.6%). Others included gallstone pancreatitis in 3 (5.89%), chronic cholecystitis in 2 (3.92%), gall bladder malignancy in 4 (7.84%), and ascending cholangitis in 1 (1.97%). Management included laparoscopic cholecystectomy in 34 (66.7%), conservative treatment in 8 (15.7%), and ERCP in 6 (11.7%). The mean waiting time for laparoscopic cholecystectomy in patients with acute cholecystitis was 74 days, with a range of 3 to 146 days from the date of diagnosis. Only one patient diagnosed with acute cholecystitis and managed with laparoscopic cholecystectomy had surgery within the 7-day time frame. Hospital re-admissions were reported in 5 patients (9.8%) due to vomiting (1), ascending cholangitis (1), and gallstone pancreatitis (3).
Discussion: Guidelines were not met for patients presenting to Gozo General Hospital with acute biliary pathology. This resulted in 5 patients being re-admitted to hospital while waiting for definitive surgery. The local issues resulting in the delay to surgery need to be identified and steps taken to facilitate the provision of urgent cholecystectomy for suitable patients.
Keywords: biliary colic, acute cholecystitis, laparoscopic cholecystectomy, conservative management
Procedia PDF Downloads 161
814 Evaluating the Success of an Intervention Course in a South African Engineering Programme
Authors: Alessandra Chiara Maraschin, Estelle Trengove
Abstract:
In South Africa, only 23% of engineering students attain their degrees in the minimum time of 4 years. This raises the question: why is the 4-year throughput rate so low? Improving the throughput rate is crucial in guiding students along the shortest possible path to completion. The Electrical Engineering programme has a fixed curriculum, and students must pass all courses in order to graduate. In South Africa, as in several other countries, many students rely on external funding such as bursaries from companies in industry. If students fail a course, they often lose their bursaries, and most might not be able to fund their 'repeating year' fees. It is thus important to improve the throughput rate, since for many students, graduating from university is a way out of poverty for an entire family. In Electrical Engineering, the Software Development I course (an introduction to C++ programming) has been found to be a significant hurdle for students, with a low pass rate. It has been well documented that students struggle with this type of course, as it introduces a number of new threshold concepts that can be challenging to grasp in a short time frame. In an attempt to mitigate this situation, a part-time night-school for Software Development I was introduced in 2015 as an intervention measure. The course covers all the material of the Software Development I module and gives students who failed the course in the first semester a second chance to pass it by taking the night-school course. The purpose of this study is to determine whether the introduction of this intervention course can be considered a success. The success of the intervention is assessed in two ways. The study first looks at whether the night-school course contributed to improving the pass rate of the Software Development I course.
Secondly, the study examines whether the intervention contributed to improving the overall throughput from the 2nd year to the 3rd year of study at a South African university. Second-year academic results for a sample of 1216 students were collected for 2010-2017. Preliminary results show that the lowest pass rate for Software Development I was recorded in 2017, at 34.9%. Since the intervention course's inception, the pass rate for Software Development I has increased each year from 2015-2017, by 13.75%, 25.53% and 25.81% respectively. To conclude, the preliminary results show that the intervention course is a success in improving the pass rate of Software Development I.
Keywords: academic performance, electrical engineering, engineering education, intervention course, low pass rate, software development course, throughput
Procedia PDF Downloads 164
813 Infectivity of Hyalomma Ticks for Theileria annulata Using 18s rRNA PCR
Authors: Muhammad S. Sajid, A. Iqbal, A. Kausar, M. Jawad-ul-Hassan, Z. Iqbal, Hafiz M. Rizwan, M. Saqib
Abstract:
Among the ixodid ticks, species of the genus Hyalomma are of prime importance, as they can survive in harsh conditions better than other species. Similarly, among various tick-borne pathogens, Theileria (T.) annulata, the causative agent of tropical theileriosis in large ruminants, is responsible for reduced productivity and ultimately substantial economic losses due to morbidity and mortality. The present study was planned to screen vector ticks using molecular techniques to determine tick-borne theileriosis in district Toba Tek Singh (T. T. Singh), Punjab, Pakistan. For this purpose, among the ticks collected (n = 2252) from livestock and their microclimate, Hyalomma spp. were subjected to dissection for procurement of salivary glands (SGs) and pooling (an average of 8 acini per pool). Each pool of acini was used for DNA extraction, quantification and primer-specific amplification of the 18S rRNA of Theileria (T.) annulata. The amplicons were electrophoresed on 1.8% agarose gel followed by imaging to identify the band specific for T. annulata. For confirmation, the positive amplicons were subjected to sequencing, BLAST analysis and homology search using NCBI software. The number of Theileria-infected acini was significantly higher (P < 0.05) in female ticks vs. male ticks, infesting ticks vs. questing ticks, and riverine-collected vs. non-riverine-collected ticks. The data provide the first attempt to quantify the vectorial capacity of ixodid ticks in Pakistan for T. annulata, which can be helpful in risk analysis of theileriosis for the domestic livestock population of the country.
Keywords: Hyalomma anatolicum, ixodids, PCR, Theileria annulata
Procedia PDF Downloads 288
812 Debating the Role of Patriarchy in the Incidence of Gender-Based Violence in Jordan: Systematic Review of the Literature
Authors: Nour Daoud
Abstract:
Patriarchy continues to thrive in Jordan, where male-controlled values are still entrenched in a society suffering from alarming rates of gender-based violence (GBV). This paper is a systematic review of the literature that attempts to evaluate and interpret all available research evidence relevant to determining the extent to which patriarchy contributes to the occurrence, re-occurrence, and continuation of GBV in Jordan. Twenty-one (21) full-text articles were selected for the in-depth review because they met the established inclusion criteria. 81 percent of the articles included primary data, while 19 percent included secondary data. Analysis of the data was based on a specific extraction form developed in Excel to respond to the main goal of the paper. Interpretation of the data was in light of the theorization of different schools of feminism on the relationship between patriarchy and gender-based violence. Findings show that 33 percent of the selected articles affirm that the patriarchal standpoint best explains the role of patriarchy in the incidence of gender-based violence in Jordan under its three main themes (honor-based violence, intimate partner violence and street harassment). Despite the limited number of articles found debating this argument and the low percentage of articles that acknowledged the role of patriarchy in the incidence of gender-based violence in Jordan, this paper breaks the ice for future empirical studies on this subject. It is also an invitation for all Jordanian women to unite their efforts in order to eradicate all forms of victimization against them.
Keywords: honor-based violence, intimate partner violence, middle-east, street harassment
Procedia PDF Downloads 226
811 HLB Disease Detection in Omani Lime Trees using Hyperspectral Imaging Based Techniques
Authors: Jacintha Menezes, Ramalingam Dharmalingam, Palaiahnakote Shivakumara
Abstract:
In recent years, Omani acid lime cultivation and production has been affected by citrus greening or Huanglongbing (HLB) disease. HLB is one of the most destructive diseases of citrus, with no remedies or countermeasures to stop it. Currently used polymerase chain reaction (PCR) and enzyme-linked immunosorbent assay (ELISA) HLB detection tests require lengthy and labor-intensive laboratory procedures. Furthermore, the equipment and staff needed to carry out the laboratory procedures are frequently specialized, making them a less than optimal solution for detection of the disease. The current research uses hyperspectral imaging technology for automatic detection of citrus trees with HLB disease. Omani citrus tree leaf images were captured with a portable Specim IQ hyperspectral camera. The research considered healthy, nutrition-deficient, and HLB-infected leaf samples, labelled on the basis of the polymerase chain reaction (PCR) test. The high-resolution image samples were sliced into sub-cubes. The sub-cubes were further processed to obtain RGB images with spatial features. Similarly, RGB spectral slices were obtained through a moving window on the wavelength. The resized spectral-spatial RGB images were given to convolutional neural networks for deep feature extraction. The current research was able to classify a given sample into the appropriate class with 92.86% accuracy, indicating the effectiveness of the proposed techniques. The significant bands showing a difference between the three types of leaves are found to be 560 nm, 678 nm, 726 nm and 750 nm.
Keywords: huanglongbing (HLB), hyperspectral imaging (HSI), Omani citrus, CNN
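The slicing step described in the abstract can be illustrated with a minimal sketch. All specifics here are assumptions for the example: the cube dimensions, the 4x4 sub-cube size, and the band indices chosen for the pseudo-RGB composite are invented, and a plain nested list stands in for a real Specim IQ hyperspectral capture.

```python
# Sketch: slice a hyperspectral cube into spatial sub-cubes, then keep
# three bands per pixel to form a pseudo-RGB image. All dimensions and
# band indices are illustrative assumptions, not the study's values.

def make_cube(rows, cols, bands):
    """Synthetic cube; each value encodes (row, col, band) for traceability."""
    return [[[r * 10000 + c * 100 + b for b in range(bands)]
             for c in range(cols)]
            for r in range(rows)]

def spatial_subcubes(cube, size):
    """Cut the cube into non-overlapping size x size spatial sub-cubes."""
    rows, cols = len(cube), len(cube[0])
    subs = []
    for r0 in range(0, rows - size + 1, size):
        for c0 in range(0, cols - size + 1, size):
            subs.append([row[c0:c0 + size] for row in cube[r0:r0 + size]])
    return subs

def rgb_composite(subcube, band_r, band_g, band_b):
    """Keep only three spectral bands per pixel -> pseudo-RGB image."""
    return [[(px[band_r], px[band_g], px[band_b]) for px in row]
            for row in subcube]

cube = make_cube(8, 8, 6)            # 8x8 pixels, 6 spectral bands
subs = spatial_subcubes(cube, 4)     # four 4x4 sub-cubes
rgb = rgb_composite(subs[0], 0, 2, 5)
print(len(subs), len(rgb), len(rgb[0]))   # 4 4 4
```

In practice the band triple would be chosen near informative wavelengths; the study reports 560 nm, 678 nm, 726 nm and 750 nm as the significant bands.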
Procedia PDF Downloads 80
810 A Novel Method for Isolation of Kaempferol and Quercetin from Podophyllum Hexandrum Rhizome
Authors: S. B. Bhandare, K. S. Laddha
Abstract:
Podophyllum hexandrum, belonging to the family Berberidaceae, has gained attention in phytochemical and pharmacological research as it shows excellent anticancer activity and has been used in the treatment of skin diseases, sunburns and radioprotection. Chemically, it contains lignans and flavonoids such as kaempferol, quercetin and their glycosides. Objective: To isolate and identify kaempferol and quercetin from Podophyllum rhizome. Method: The powdered rhizome of Podophyllum hexandrum was subjected to Soxhlet extraction with methanol. This methanolic extract was used to obtain podophyllin. Podophyllin was extracted with ethyl acetate, and this extract was then concentrated and subjected to column chromatography to obtain purified kaempferol and quercetin. Result: Isolated kaempferol and quercetin were light yellow and dark yellow in colour, respectively. TLC of the isolated compounds was performed using chloroform:methanol (9:1), which showed single bands on the silica plate at Rf 0.6 and 0.4 for kaempferol and quercetin, respectively. UV spectrometric studies showed UV maxima (methanol) at 259, 360 nm and 260, 370 nm, which are identical with standard kaempferol and quercetin, respectively. Both IR spectra exhibited prominent absorption bands for free phenolic OH at 3277 and 3296.2 cm-1 and for conjugated C=O at 1597 and 1659.7 cm-1, respectively. The mass spectra of kaempferol and quercetin showed (M+1) peaks at m/z 287 and 303.09, respectively. 1H NMR analysis of both isolated compounds exhibited the typical four-peak pattern of two doublets at δ 6.86 and δ 8.01, assigned to H-3',5' and H-2',6' respectively. The absence of signals below δ 6.81 in the 1H NMR spectrum supported the aromatic nature of the compounds. Kaempferol and quercetin showed 98.1% and 97% purity by HPLC at UV 370 nm. Conclusion: An easy and simple method for the isolation of kaempferol and quercetin was developed, and their structures were confirmed by UV, IR, NMR and mass studies.
The method showed good reproducibility, yield and purity.
Keywords: flavonoids, kaempferol, podophyllum rhizome, quercetin
Procedia PDF Downloads 304
809 Radiographic Predictors of Mandibular Third Molar Extraction Difficulties under General Anaesthetic
Authors: Carolyn Whyte, Tina Halai, Sonita Koshal
Abstract:
Aim: There are many methods available to assess the potential difficulty of third molar surgery. This study investigated various factors to assess whether they had a bearing on the difficulties encountered. Study design: A retrospective study was completed of 62 single mandibular third molar teeth removed under day-case general anaesthesia between May 2016 and August 2016 by 3 consultant oral surgeons. Method: Data collection was performed by examining the OPG radiograph of each tooth and recording the necessary data: depth of impaction, angulation, bony impaction, point of application in relation to the second molar, root morphology, Pell and Gregory classification, and Winter's lines. This was completed by one assessor and verified by another. Information on medical history, anxiety, ethnicity and age was recorded. Case notes and surgical entries were examined for any difficulties encountered. Results: There were 5 cases with surgical difficulties, including fracture of root apices (3, left in situ), prolonged bleeding (1) and post-operative numbness lasting >6 months (1). Four of the 5 cases had Pell and Gregory classification (B), where the occlusal plane of the impacted tooth lies between the occlusal plane and the cervical line of the adjacent tooth. 80% of cases had the point of application in either the coronal or apical one third (1/3) in relation to the second molar. However, there was variability in all other aspects of assessment in predicting difficulty of removal. Conclusions: All cases that encountered difficulties had at least one predictor of potential complexity, but these varied case by case.
Keywords: impaction, mandibular third molar, radiographic assessment, surgical removal
Procedia PDF Downloads 181
808 Pilot-free Image Transmission System of Joint Source Channel Based on Multi-Level Semantic Information
Authors: Linyu Wang, Liguo Qiao, Jianhong Xiang, Hao Xu
Abstract:
In semantic communication, existing pilot-free joint source-channel coding (JSCC) wireless communication systems have unstable transmission performance and cannot effectively capture the global information and location information of images. In this paper, a pilot-free image transmission system of joint source-channel coding based on multi-level semantic information (multi-level JSCC) is proposed. The transmitter of the system is composed of two networks. The feature extraction network is used to extract the high-level semantic features of the image, compressing the information to be transmitted and improving bandwidth utilization. The feature retention network is used to preserve low-level semantic features and image details to improve communication quality. The receiver is also composed of two networks. The received high-level semantic features are fused with the low-level semantic features after a feature enhancement network in the same dimension, and then the image dimensions are restored through a feature recovery network, with the image location information effectively used for image reconstruction. This paper verifies that the proposed multi-level JSCC algorithm can effectively transmit and recover image information in both the AWGN channel and the Rayleigh fading channel, and the peak signal-to-noise ratio (PSNR) is improved by 1-2 dB compared with other algorithms under the same simulation conditions.
Keywords: deep learning, JSCC, pilot-free picture transmission, multilevel semantic information, robustness
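The reported 1-2 dB gain is measured in PSNR, which can be computed directly from pixel data. The sketch below is a generic PSNR routine with invented toy pixel lists; it illustrates only the metric, not the paper's networks.

```python
import math

def psnr(original, reconstructed, max_val=255.0):
    """Peak signal-to-noise ratio between two equal-length pixel lists.
    PSNR = 10 * log10(MAX^2 / MSE); higher means a closer reconstruction."""
    mse = sum((o - r) ** 2 for o, r in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return float("inf")          # identical images
    return 10.0 * math.log10(max_val ** 2 / mse)

orig = [100, 120, 140, 160]          # toy 8-bit pixel values
recon = [101, 119, 142, 158]         # small reconstruction error
noisy = [110, 110, 150, 150]         # larger error

print(round(psnr(orig, recon), 2))             # 44.15
print(psnr(orig, recon) > psnr(orig, noisy))   # True
```

A 1-2 dB improvement on this logarithmic scale corresponds to a substantially lower mean squared reconstruction error.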
Procedia PDF Downloads 120
807 Automatic Staging and Subtype Determination for Non-Small Cell Lung Carcinoma Using PET Image Texture Analysis
Authors: Seyhan Karaçavuş, Bülent Yılmaz, Ömer Kayaaltı, Semra İçer, Arzu Taşdemir, Oğuzhan Ayyıldız, Kübra Eset, Eser Kaya
Abstract:
In this study, our goal was to perform tumor staging and subtype determination automatically using different texture analysis approaches for a very common cancer type, i.e., non-small cell lung carcinoma (NSCLC). In particular, we introduced a texture analysis approach, Laws' texture filters, to be used in this context for the first time. The 18F-FDG PET images of 42 patients with NSCLC were evaluated. The number of patients for each tumor stage, i.e., I-II, III or IV, was 14. The patients had ~45% adenocarcinoma (ADC) and ~55% squamous cell carcinoma (SqCC). The MATLAB technical computing language was employed in the extraction of 51 features using first-order statistics (FOS), the gray-level co-occurrence matrix (GLCM), the gray-level run-length matrix (GLRLM), and Laws' texture filters. The feature selection method employed was sequential forward selection (SFS). Selected textural features were used in automatic classification by k-nearest neighbors (k-NN) and support vector machines (SVM). In the automatic classification of tumor stage, the accuracy was approximately 59.5% with the k-NN classifier (k=3) and 69% with SVM (one-versus-one paradigm), using 5 features. In the automatic classification of tumor subtype, the accuracy was around 92.7% with SVM one vs. one. Texture analysis of FDG-PET images might be used, in addition to metabolic parameters, as an objective tool to assess tumor histopathological characteristics and in automatic classification of tumor stage and subtype.
Keywords: cancer stage, cancer cell type, non-small cell lung carcinoma, PET, texture analysis
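A minimal sketch of the pipeline's flavor: one GLCM-derived feature (contrast) fed to a k-NN majority vote. The 4-level toy images, the labels, and the single-feature choice are assumptions for illustration only; the study itself extracted 51 features (FOS, GLCM, GLRLM, Laws' filters) from 18F-FDG PET images in MATLAB.

```python
# Sketch: one GLCM-derived texture feature (contrast) plus a k-nearest-
# neighbour classifier. Toy 4-level images and labels are illustrative
# assumptions, not the study's PET data.

from collections import Counter

def glcm_contrast(img):
    """Contrast of the horizontal-offset (dx=1) grey-level co-occurrence
    matrix: sum over level pairs of P(i, j) * (i - j)^2."""
    pairs = Counter()
    for row in img:
        for a, b in zip(row, row[1:]):
            pairs[(a, b)] += 1
    total = sum(pairs.values())
    return sum(n / total * (i - j) ** 2 for (i, j), n in pairs.items())

def knn_predict(train, query_feature, k=3):
    """train: list of (feature, label). Majority vote among k nearest."""
    nearest = sorted(train, key=lambda t: abs(t[0] - query_feature))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

smooth = [[0, 0, 1, 1], [1, 1, 0, 0]]    # neighbouring levels are close
rough = [[0, 3, 0, 3], [3, 0, 3, 0]]     # large level jumps
train = [(glcm_contrast(smooth), "stage I-II"),
         (glcm_contrast(rough), "stage III"),
         (glcm_contrast([[0, 1, 0, 1]]), "stage I-II")]

query = [[0, 0, 0, 1], [1, 1, 1, 0]]     # low-contrast texture
print(knn_predict(train, glcm_contrast(query), k=3))
```

The real feature set would be computed per lesion region and pruned by sequential forward selection before classification.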
Procedia PDF Downloads 326
806 The Automatic Transliteration Model of Images of the Book Hamong Tani Using Statistical Approach
Authors: Agustinus Rudatyo Himamunanto, Anastasia Rita Widiarti
Abstract:
Transliteration of Javanese manuscripts is one of the methods used to preserve and pass on the wealth of past literature to the present generation in Indonesia. The manual transliteration process commonly requires philologists and takes a relatively long time. An automatic transliteration process is expected to shorten this time and thus support the work of philologists. Preprocessing and segmentation stages are first applied to manage the document images, yielding the script-unit images that compose the input documents, free from noise and similar in thickness, size, and slope. The next stage, feature extraction, is used to find unique characteristics that distinguish each Javanese script image. One of the characteristics used in this research is the number of black pixels in each image unit. Each Javanese script image in the training data undergoes the same process as the input characters. The system was tested with data from the book Hamong Tani, selected for its content, age and number of pages, which were considered sufficient as experimental model input. Based on random-page automatic transliteration tests, the maximum accuracy obtained was 81.53%. This accuracy was obtained with a 32x32 pixel input image size and a 5x5 image window. With regard to these results, it can be concluded that the automatic transliteration model offered is relatively good.
Keywords: Javanese script, character recognition, statistical, automatic transliteration
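The black-pixel-count characteristic mentioned above can be sketched in a few lines: count black pixels per binary script-unit image and match against the nearest training template. The tiny glyph bitmaps and labels below are hypothetical stand-ins for Javanese characters, not the system's actual data.

```python
# Sketch of the simplest characteristic in the paper: black-pixel count
# per script-unit image, with a nearest-template match. Glyph bitmaps
# and labels are invented stand-ins for Javanese characters.

def black_pixels(bitmap):
    """Count black (1) pixels in a binary bitmap (list of rows)."""
    return sum(sum(row) for row in bitmap)

def classify(templates, bitmap):
    """Return the template label whose black-pixel count is closest."""
    count = black_pixels(bitmap)
    return min(templates, key=lambda name: abs(templates[name] - count))

# Hypothetical training glyphs: label -> black-pixel count
templates = {
    "ha": black_pixels([[1, 1, 0], [1, 0, 0], [1, 1, 1]]),   # 6 pixels
    "na": black_pixels([[1, 0, 0], [1, 0, 0], [1, 0, 0]]),   # 3 pixels
}

unknown = [[1, 1, 1], [1, 0, 0], [1, 1, 0]]   # 6 black pixels
print(classify(templates, unknown))
```

A single count is of course ambiguous between many glyphs, which is consistent with the reported accuracy ceiling of 81.53%.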
Procedia PDF Downloads 339
805 Outwrestling Cataclysmic Tsunamis at Hilo, Hawaii: Using Technical Developments of the past 50 Years to Improve Performance
Authors: Mark White
Abstract:
The best practices for owners and urban planners to manage tsunami risk have evolved during the last fifty years, and related technical advances have created opportunities for them to obtain better performance than in earlier cataclysmic tsunami inundations. This basic pattern is illustrated at Hilo Bay, the waterfront area of Hilo, Hawaii, an urban seaport which faces the most severe tsunami hazard of the Hawaiian archipelago. Since April 1, 1946, Hilo Bay has endured tsunami waves with a maximum water height exceeding 2.5 meters following four severe earthquakes: Unimak Island (Mw 8.6, 6.1 m) in 1946; Valdivia (Mw 9.5, the largest earthquake of the 20th century, 10.6 m) in 1960; Prince William Sound (Mw 9.2, 3.8 m) in 1964; and Kalapana (Mw 7.7, the largest earthquake in Hawaii since 1868, 2.6 m) in 1975. Setting aside numerous smaller tsunamis during the same time frame, these four cataclysmic tsunamis have caused property losses in Hilo exceeding $1.25 billion and more than 150 deaths. It is reasonable to foresee another cataclysmic tsunami inundating the urban core of Hilo in the next 50 years, which, if unchecked, could cause additional deaths and losses in the hundreds of millions of dollars. Urban planners and individual owners are now in a position to reduce these losses in the next foreseeable tsunami that generates maximum water heights between 2.5 and 10 meters in Hilo Bay. Since 1946, Hilo planners and individual owners have already created buffer zones between the shoreline and its historic downtown area. As these stakeholders make inevitable improvements to the built environment along and adjacent to the shoreline, they should incorporate new methods for better managing the obvious tsunami risk at Hilo. At the planning level, new man-made landforms, such as tsunami parks and inundation reservoirs, should be developed.
Individual owners should require their design professionals to include sacrificial seismic and tsunami fuses that will perform well in foreseeable severe events and that can be easily repaired in the immediate aftermath. These investments before the next cataclysmic tsunami at Hilo will yield substantial reductions in property losses and fatalities.
Keywords: hilo, tsunami parks, reservoirs, fuse systems, risk management
Procedia PDF Downloads 165
804 Stereotyping of Non-Western Students in Western Universities: Applying Critical Discourse Analysis to Undermine Educational Hegemony
Authors: Susan Lubbers
Abstract:
This study applies critical discourse analysis to the language used by educators to frame international students of Asian backgrounds in Anglo-Western universities as quiet, shy, passive and unable to think critically. Emphasis is on the self-promoted ‘internationalised’ Australian tertiary context, where negative stereotypes are commonly voiced not only in the academy but also in the media. Parallels are drawn as well with other Anglo-Western educational contexts. The study critically compares the discourse of these persistent negative stereotypes, with in-class and interview discourses of international students of Asian and Western language, cultural and educational backgrounds enrolled in a Media and Popular Culture unit in an Australian university. The focus of analysis of the student discourse is on their engagement in critical dialogic interactions on the topics of culture and interculturality. The evidence is also drawn from student interviews and focus groups and from observation of whole-class discussion participation rates. The findings of the research project provide evidence that counters the myth of student as problem. They point rather to the widespread lack of intercultural awareness of Western educators and students as being at the heart of the negative perceptions of students of Asian backgrounds. The study suggests the efficacy of an approach to developing intercultural competence that is embedded, or integrated, into tertiary programs. The presentation includes an overview of the main strategies that have been developed by the tertiary educator (author) to support the development of intercultural competence of and among the student cohort. The evidence points to the importance of developing intercultural competence among tertiary educators and students. 
The failure by educators to ensure that the diverse voices, ideas and perspectives of students from all cultural, educational and language backgrounds are heard in our classrooms means that our universities can hardly be regarded or promoted as genuinely internationalised. They will continue as undemocratic institutions that perpetuate persistent Western educational hegemony.
Keywords: critical discourse analysis, critical thinking, embedding, intercultural competence, interculturality, international student, internationalised education
Procedia PDF Downloads 292
803 Performance Evaluation and Comparison between the Empirical Mode Decomposition, Wavelet Analysis, and Singular Spectrum Analysis Applied to the Time Series Analysis in Atmospheric Science
Authors: Olivier Delage, Hassan Bencherif, Alain Bourdier
Abstract:
Signal decomposition approaches represent an important step in time series analysis, providing useful knowledge and insight into the data and the characteristics of the underlying dynamics, while also facilitating tasks such as noise removal and feature extraction. As most observational time series are nonlinear and nonstationary, resulting from the interaction of several physical processes at different time scales, experimental time series exhibit fluctuations at all time scales and require the development of specific signal decomposition techniques. The most commonly used techniques are data-driven, making it possible to obtain well-behaved signal components without making any prior assumptions about the input data. Among the most popular time series decomposition techniques, most cited in the literature, are the empirical mode decomposition and its variants, the empirical wavelet transform, and singular spectrum analysis. With the increasing popularity and utility of these methods in wide-ranging applications, it is imperative to gain a good understanding of and insight into the operation of these algorithms. In this work, we describe all of the techniques mentioned above, as well as their ability to denoise signals, capture trends, identify components corresponding to the physical processes involved in the evolution of the observed system, and deduce the dimensionality of the underlying dynamics. Results obtained with all of these methods on experimental total ozone column and rainfall time series will be discussed and compared.
Keywords: denoising, empirical mode decomposition, singular spectrum analysis, time series, underlying dynamics, wavelet analysis
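As one concrete anchor for the techniques compared above: the first step of singular spectrum analysis embeds the series into a Hankel (trajectory) matrix, whose singular value decomposition later yields the signal components. The sketch below shows only this embedding step, on an invented toy series; the window length is a free parameter and the numbers are not the ozone or rainfall data.

```python
# Sketch of the SSA embedding step: a length-N series becomes an
# L x K Hankel (trajectory) matrix (K = N - L + 1). Series values and
# the window length are illustrative assumptions.

def trajectory_matrix(series, window):
    """Hankel trajectory matrix: column j holds series[j : j + window]."""
    n = len(series)
    k = n - window + 1
    return [[series[j + i] for j in range(k)] for i in range(window)]

series = [1, 3, 0, -1, 2, 5, 4]      # toy stand-in for an observed signal
X = trajectory_matrix(series, window=3)
for row in X:
    print(row)
# Anti-diagonals of X are constant, the defining Hankel property.
```

The subsequent SSA stages (SVD of X, eigentriple grouping, diagonal averaging) operate on this matrix.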
Procedia PDF Downloads 118
802 Quantification of Hydrogen Sulfide and Methyl Mercaptan in Air Samples from a Waste Management Facility
Authors: R. F. Vieira, S. A. Figueiredo, O. M. Freitas, V. F. Domingues, C. Delerue-Matos
Abstract:
The presence of sulphur compounds such as hydrogen sulphide and mercaptans is one of the reasons why waste-water treatment and waste management are associated with odour emissions. In this context, having a quantification method for these compounds helps in the optimization of treatments aimed at their elimination, namely biofiltration processes. The aim of this study was the development of a method for the quantification of odorous gases in air samples from waste treatment plants. A method based on headspace solid-phase microextraction (HS-SPME) coupled with gas chromatography and flame photometric detection (GC-FPD) was used to analyse H2S and methyl mercaptan (MM). The extraction was carried out with a 75-μm Carboxen-polydimethylsiloxane fiber coating at 22 ºC for 20 min, and the samples were analysed on a GC 2010 Plus A from Shimadzu with a sulphur-filter detector: splitless mode (0.3 min); column temperature program from 60 ºC, increased by 15 ºC/min to 100 ºC (2 min). The injector temperature was held at 250 ºC, and the detector at 260 ºC. For the calibration curve, a gas dilution unit (digital Hovagas G2 Multi Component Gas Mixer) was used to prepare the standards. This unit had two input connections, one for a stream of the gas to be diluted and another for a stream of nitrogen, and an output connected to a glass bulb. A 40 ppm H2S cylinder and a 50 ppm MM cylinder were used. The equipment was programmed to the selected concentration, and it automatically carried out the dilution into the glass bulb. The mixture was left flowing through the glass bulb for 5 min, and then the extremities were closed. This method allowed calibration between 1-20 ppm for H2S and between 0.02-0.1 ppm and 1-3.5 ppm for MM. Several quantifications of air samples from the inlet and outlet of a biofilter operating in a waste management facility in the north of Portugal allowed the evaluation of the biofilter's performance.
Keywords: biofiltration, hydrogen sulphide, mercaptans, quantification
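The calibration-curve step described above amounts to fitting detector response against known standard concentrations and inverting the fitted line for unknown samples. The sketch below uses ordinary least squares with invented (ppm, peak-area) pairs; the numbers are illustrative assumptions, not the study's measurements.

```python
# Sketch: build a calibration line (peak area vs. concentration) from
# diluted gas standards, then invert it to quantify an unknown sample.
# The (ppm, area) pairs are invented illustrative numbers.

def fit_line(xs, ys):
    """Least-squares slope and intercept for y = a * x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def quantify(area, slope, intercept):
    """Invert the calibration line: concentration = (area - b) / a."""
    return (area - intercept) / slope

ppm = [1.0, 5.0, 10.0, 15.0, 20.0]        # H2S standard concentrations
area = [12.0, 60.0, 120.0, 180.0, 240.0]  # detector response (toy, exact line)
slope, intercept = fit_line(ppm, area)
print(round(quantify(90.0, slope, intercept), 2))  # concentration of an unknown
```

In practice each analyte and concentration range (e.g. the two MM ranges) would get its own fitted line, with residuals checked for linearity.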
Procedia PDF Downloads 476
801 Identification and Isolation of E. coli O₁₅₇:H₇ from Water and Wastewater of Shahrood and Neka Cities by PCR Technique
Authors: Aliasghar Golmohammadian, Sona Rostampour Yasouri
Abstract:
One of the most important intestinal pathogenic strains is E. coli O₁₅₇:H₇. This pathogenic bacterium is transmitted to humans through water and food. E. coli O₁₅₇:H₇ is the main cause of hemorrhagic colitis (HC), hemolytic uremic syndrome (HUS), thrombotic thrombocytopenic purpura (TTP) and, in some cases, death. Since E. coli O₁₅₇:H₇ can be transmitted through the consumption of different foods, including vegetables, agricultural products, and fresh dairy products, this study aims to identify and isolate E. coli O₁₅₇:H₇ from water and wastewater by the PCR technique. One hundred twenty samples of water and wastewater were collected in sterile Falcon tubes from the cities of Shahrood and Neka. The samples were checked for colony formation after appropriate centrifugation and cultivation on the specific medium sorbitol MacConkey agar (SMAC) and other diagnostic media for E. coli O₁₅₇:H₇. The plates were also observed macroscopically and microscopically. Then, the necessary phenotypic tests were performed on the colonies and, finally, after DNA extraction, the PCR technique was performed with specific primers for the rfbE and stx2 genes. Five samples (6%) out of all the samples examined were determined positive by the PCR technique, by observing the bands related to the mentioned genes on agarose gel electrophoresis. PCR is a fast and accurate method to identify E. coli O₁₅₇:H₇. Considering that E. coli is a resistant bacterium that survives in water and food for weeks and months, the PCR technique can provide the possibility of quick detection of contaminated water. Moreover, it helps communities control and prevent the transfer of the bacteria to clean water, groundwater, and agricultural and even dairy products.
Keywords: E. coli O₁₅₇:H₇, PCR, water, wastewater
Procedia PDF Downloads 65
800 Effects of Different Meteorological Variables on Reference Evapotranspiration Modeling: Application of Principal Component Analysis
Authors: Akinola Ikudayisi, Josiah Adeyemo
Abstract:
The correct estimation of reference evapotranspiration (ETₒ) is required for effective irrigation water resources planning and management. However, several variables must be considered when estimating and modeling ETₒ. This study therefore carries out a multivariate analysis of the correlated variables involved in the estimation and modeling of ETₒ at the Vaalharts irrigation scheme (VIS) in South Africa using the principal component analysis (PCA) technique. Weather and meteorological data between 1994 and 2014 were obtained from both the South African Weather Service (SAWS) and the Agricultural Research Council (ARC) in South Africa for this study. Average monthly data of minimum and maximum temperature (°C), rainfall (mm), relative humidity (%), and wind speed (m/s) were the inputs to the PCA-based model, while ETₒ was the output. The PCA technique was adopted to extract the most important information from the dataset and also to analyze the relationship between the five variables and ETₒ, in order to determine the most significant variables affecting ETₒ estimation at VIS. From the model performance, two principal components explaining 82.7% of the variance were retained after the eigenvector extraction. The results of the two principal components were compared, and the model output shows that minimum temperature, maximum temperature and wind speed are the most important variables in ETₒ estimation and modeling at VIS. In other words, ETₒ increases with temperature and wind speed. Other variables, such as rainfall and relative humidity, are less important and cannot provide enough information about ETₒ estimation at VIS. The outcome of this study has helped to reduce the input variable dimensionality from five to the three most significant variables in ETₒ modeling at VIS, South Africa.
Keywords: irrigation, principal component analysis, reference evapotranspiration, Vaalharts
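The variance-explained computation at the heart of PCA can be shown for the simplest case of two standardized variables, where the covariance matrix reduces to the correlation matrix [[1, r], [r, 1]] with eigenvalues 1 + |r| and 1 - |r|. The temperature and ETₒ numbers below are invented toy values, not the VIS records, and the two-variable restriction is an assumption made so the eigenvalues have a closed form.

```python
import math

def corr(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def pc1_variance_share(xs, ys):
    """For two standardized variables the covariance matrix is the
    correlation matrix [[1, r], [r, 1]] with eigenvalues 1 + |r| and
    1 - |r|, so the first principal component explains (1 + |r|) / 2
    of the total variance."""
    r = abs(corr(xs, ys))
    return (1 + r) / 2

# Toy monthly values: a strongly correlated pair (say, maximum
# temperature and ETo) is almost fully captured by one component.
tmax = [20.0, 22.0, 25.0, 27.0, 30.0]
eto = [3.1, 3.6, 3.8, 4.0, 4.7]
print(round(pc1_variance_share(tmax, eto), 3))
```

With five variables, as in the study, the same idea applies to the 5x5 correlation matrix, whose leading eigenvalues give the retained components' share of variance.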
Procedia PDF Downloads 258
799 A Literature Review: The Anti-Obesity Effect of Epigallocatechin-3-Gallate of Camellia sinensis (Green Tea) Extract as a Potential Adjuvant Therapy for Management of Obesity
Authors: Nunuy Nuraeni, Vera Amalia Lestari, Atri Laranova, Viena Nissa Mien Fadhillah, Mutia, Muhammad Ikhlas Abdian Putra
Abstract:
Introduction: Obesity is a common disease with high prevalence, especially in developing countries, including Indonesia. An obesogenic lifestyle, such as excessive food intake and sedentary habits, is the major environmental etiology of obesity. Obesity is also a burden disease with high morbidity due to its complications, such as diabetes mellitus and hypertension. The objective of this literature review is to understand how epigallocatechin-3-gallate from green tea (Camellia sinensis) acts as an anti-obesity agent and reduces the complications of obesity. Material and Methods: This study is based on secondary data analysis complemented by primary data collection from several journals and textbooks. We identified the effect of epigallocatechin-3-gallate from green tea (Camellia sinensis) as an adjuvant therapy for the management of obesity and the prevention of its complications. Results: Based on the results, green tea (Camellia sinensis) contains epigallocatechin-3-gallate (EGCG), which has anti-obesity effects such as inducing apoptosis, inhibiting adipogenesis, increasing lipolytic activity, and increasing fat oxidation and thermogenesis. Discussion: EGCG is naturally present in green tea and has biological activity with a potential effect in treating obesity. Conclusion: EGCG is capable of treating obesity. Consuming EGCG can prevent obesity in healthy individuals and prevent complications in patients with obesity.
Keywords: adjuvant therapy, anti-obesity effect, complication, epigallocatechin-3-gallate, obesity
Procedia PDF Downloads 279