Search results for: real anthropometric database
5410 Estimation of State of Charge, State of Health and Power Status for the Li-Ion Battery On-Board Vehicle
Authors: S. Sabatino, V. Calderaro, V. Galdi, G. Graber, L. Ippolito
Abstract:
Climate change is a rapidly growing global threat caused mainly by increased emissions of carbon dioxide (CO₂) into the atmosphere. These emissions come from multiple sources, including industry, power generation, and the transport sector. The need to tackle climate change and reduce CO₂ emissions is indisputable. A crucial step towards decarbonizing the transport sector is the adoption of electric vehicles (EVs). These vehicles use lithium-ion (Li-Ion) batteries as an energy source, making them highly efficient with low direct emissions. However, Li-Ion batteries are not without problems, including the risk of overheating and performance degradation. To ensure their safety and longevity, it is essential to use a battery management system (BMS). The BMS constantly monitors battery status and regulates temperature and cell balance, ensuring optimal performance and preventing dangerous situations. Based on this monitoring, it can also manage the battery optimally to extend its life. The main parameters monitored by the BMS are State of Charge (SoC), State of Health (SoH), and State of Power (SoP). These parameters can be evaluated in two ways: offline, using benchtop batteries tested in the laboratory, or online, using batteries installed in moving vehicles. Online estimation is the preferred approach, as it relies on capturing real-time data from batteries operating in real-life situations, such as everyday EV use. Actual battery usage conditions are highly variable. Moving vehicles are exposed to a wide range of factors, including temperature variations, different driving styles, and complex charge/discharge cycles. This variability is difficult to replicate in a controlled laboratory environment and can greatly affect performance and battery life. Online estimation captures this variety of conditions, providing a more accurate assessment of battery behavior in real-world situations.
In this article, a hybrid approach based on a neural network and a statistical method for real-time estimation of the SoC, SoH, and SoP parameters of interest is proposed. These parameters are estimated from the analysis of a one-day driving profile of an electric vehicle, assumed to be divided into the following four phases: (i) partial discharge (SoC 100% - SoC 50%), (ii) partial charge (SoC 50% - SoC 80%), (iii) deep discharge (SoC 80% - SoC 30%), and (iv) full charge (SoC 30% - SoC 100%). The neural network predicts the values of ohmic resistance and incremental capacity, while the statistical method is used to estimate the parameters of interest. This reduces the complexity of the model and improves its prediction accuracy. The effectiveness of the proposed model is evaluated by analyzing its performance in terms of root mean square error (RMSE) and mean absolute percentage error (MAPE) and comparing it with the reference method found in the literature. Keywords: electric vehicle, Li-Ion battery, BMS, state-of-charge, state-of-health, state-of-power, artificial neural networks
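The two evaluation metrics named in this abstract have standard definitions and can be computed directly. The following is a minimal pure-Python sketch; the function names `rmse` and `mape` are chosen for illustration and are not taken from the paper.

```python
import math

def rmse(actual, predicted):
    # Root mean square error between measured and predicted series.
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def mape(actual, predicted):
    # Mean absolute percentage error, expressed in percent.
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)
```

Applied to a series of true versus estimated SoC values, a lower RMSE and MAPE indicate a better estimator.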
Procedia PDF Downloads 69
5409 Visual Inspection of Road Conditions Using Deep Convolutional Neural Networks
Authors: Christos Theoharatos, Dimitris Tsourounis, Spiros Oikonomou, Andreas Makedonas
Abstract:
This paper focuses on the problem of visually inspecting and recognizing the road conditions in front of moving vehicles, targeting automotive scenarios. The goal of road inspection is to identify whether the road is slippery or not, as well as to detect possible anomalies on the road surface like potholes or speed bumps/humps. Our work is based on an artificial intelligence methodology for real-time monitoring of road conditions in autonomous driving scenarios, using state-of-the-art deep convolutional neural network (CNN) techniques. Initially, the road and ego lane are segmented within the field of view of the camera integrated into the front part of the vehicle. A novel classification CNN is utilized to distinguish between plain and slippery road textures (e.g., wet, snow, etc.). Simultaneously, a robust detection CNN identifies severe surface anomalies within the ego lane, such as potholes and speed bumps/humps, within a distance of 5 to 25 meters. The overall methodology is illustrated under the scope of an integrated application (or system), which can be integrated into complete Advanced Driver-Assistance Systems (ADAS) that provide a full range of functionalities. The proposed techniques achieve state-of-the-art detection and classification results with real-time performance running on AI accelerator devices such as Intel’s Myriad 2/X Vision Processing Unit (VPU). Keywords: deep learning, convolutional neural networks, road condition classification, embedded systems
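As a minimal illustration of the basic operation underlying the CNNs used here, the following sketches a valid-mode 2D cross-correlation in pure Python. It is a toy version of a single convolutional layer applied to one channel, not the authors' network.

```python
def conv2d(image, kernel):
    # Valid-mode 2D cross-correlation: the core building block of a CNN layer.
    # image and kernel are lists of lists (rows of pixel values / weights).
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0.0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            out[i][j] = sum(image[i + u][j + v] * kernel[u][v]
                            for u in range(kh) for v in range(kw))
    return out
```

Stacking many such filters, nonlinearities, and pooling stages yields the texture-classification and anomaly-detection networks described in the abstract.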
Procedia PDF Downloads 135
5408 Interdisciplinary Approach in Vocational Training for Orthopaedic Surgery
Authors: Mihail Nagea, Olivera Lupescu, Elena Taina Avramescu, Cristina Patru
Abstract:
Classical education of orthopedic surgeons involves lectures, self-study, workshops, cadaver dissections, and sometimes supervised practical training in surgery, an approach that quite often leaves young surgeons feeling unable to apply what they have learned, especially in surgical practice. The purpose of this paper is to present a different approach from the classical one, which enhances the practical skills of orthopedic trainees and prepares them for future practice. The paper presents the content of the research project 2015-1-RO01-KA202-015230, ERASMUS+ VET ‘Collaborative learning for enhancing practical skills for patient-focused interventions in gait rehabilitation after orthopedic surgery’, which, using e-learning as a basic tool, delivers to the trainees not only courses but, especially, practical information through videos and case scenarios, including gait analysis, in order to build patient-focused therapeutic plans adapted to the characteristics of each patient. The outcome of this project is to enhance practical skills in orthopedic surgery, and the results are evaluated from the answers to the questionnaires, but especially from the reactions within the case scenarios. The participants will thus internalize the idea that any mistake in solving the cases might represent a failure in treating a real patient. This modern approach, besides using interactivity to evaluate the theoretical and practical knowledge of the trainee, increases the sense of responsibility, as well as the ability to react properly in real cases. Keywords: interdisciplinary approach, gait analysis, orthopedic surgery, vocational training
Procedia PDF Downloads 251
5407 Processes and Application of Casting Simulation and Its Software’s
Authors: Surinder Pal, Ajay Gupta, Johny Khajuria
Abstract:
Casting simulation helps visualize mold filling and casting solidification; predict related defects like cold shuts, shrinkage porosity, and hard spots; and optimize the casting design to achieve the desired quality with high yield. Flow and solidification of molten metals are, however, very complex phenomena that are difficult to simulate correctly by conventional computational techniques, especially when the part geometry is intricate and the required inputs (like thermo-physical properties and heat transfer coefficients) are not available. Simulation software is based on the process of modeling a real phenomenon with a set of mathematical formulas. It is, essentially, a program that allows the user to observe an operation through simulation without actually performing that operation. Simulation software is used widely to design equipment so that the final product will be as close to design specs as possible without expensive in-process modification. Simulation software with real-time response is often used in gaming, but it also has important industrial applications. When the penalty for improper operation is costly, as for airplane pilots, nuclear power plant operators, or chemical plant operators, a mockup of the actual control panel is connected to a real-time simulation of the physical response, giving valuable training experience without fear of a disastrous outcome. Each casting simulation package has its own requirements and strengths; MAGMA, for example, is regarded as best suited to crack simulation. The latest-generation software AutoCAST, developed at IIT Bombay, provides a host of functions to support method engineers, including part thickness visualization, core design, multi-cavity mold design with common gating and feeding, application of various feed aids (feeder sleeves, chills, padding, etc.), simulation of mold filling and casting solidification, automatic optimization of feeders and gating driven by the desired quality level, and what-if cost analysis.
IIT Bombay has developed a set of applications for the foundry industry to improve casting yield and quality. Casting simulation is a fast and efficient solution for process design; the advanced tools available today are the result of more than 20 years of collaboration with major industrial partners and academic institutions around the world. In this paper, the process of casting simulation is studied. Keywords: casting simulation software, simulation techniques, casting simulation, processes
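The solidification times that such simulators predict numerically are classically approximated by hand with Chvorinov's rule, t = B(V/A)^n, which underlies feeder design: a feeder must solidify after the casting section it feeds. The following sketch assumes the common exponent n = 2; the function name and argument units are illustrative, not from the paper.

```python
def solidification_time(volume, surface_area, mold_constant, exponent=2.0):
    # Chvorinov's rule: t = B * (V / A)^n, a first-order estimate of the
    # total solidification time from the section modulus V/A.
    return mold_constant * (volume / surface_area) ** exponent
```

A quick feeder check compares the modulus of the feeder with that of the casting: the section with the larger V/A solidifies later.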
Procedia PDF Downloads 476
5406 Food Consumption Pattern and Other Associated Factors of Overweight/Obesity and the Prevalence of Dysglycaemia/Diabetes among Employees Attached to the Ministry of Economic Development
Authors: G. S. Sumanasekara, A. Balasuriya
Abstract:
Introduction: Studies in Sri Lanka reveal a rising trend in the prevalence of diabetes. Office employees have a sedentary lifestyle, and their eating patterns have changed due to the nutritional transition. Overall, urban and rural pre-diabetes is also increasing. Objectives: To study the general food pattern of office employees and its relation to overweight/obesity and the prevalence of diabetes among them. Method: Data were collected from office employees aged 30-60 years (n = 400) and analyzed using SPSS version 16. The study design was a descriptive cross-sectional study, and the study setting was the Ministry of Economic Development. Anthropometric measurements and blood glucose were assessed by trained nurses. Dietary pattern was studied through a food frequency questionnaire, from which daily nutrient intakes were calculated. Results: The mean age of office employees was 38.98 (SD 7.033, 95% CI); there were 245 females (61.2%) and 155 males (38.8%). Nationalities included Sinhala (67.5%), Tamil (20%), and Muslim (12.5%). Of the 127 participants (31.8%) above the normal BMI, 7 (1.8%) were overweight, 36 (9%) were obese males, 66 (16%) were obese females, and 18 (4.5%) were diabetic and obese, whereas 273 (68.2%) were within the normal range. Mean BMI was 24.1593. Mean blood sugar level was 104.646 (SD 16.018). 12% consumed tobacco products, 17.8% consumed alcohol, and 15.8% had nutrition training. Two main dietary patterns were identified: vegetarian and non-vegetarian. Mean energy intake was 1727.1 (SD 4.97), mean protein consumption 11.33 (SD 1.811), mean fat consumption 24.07 (SD 4.131), mean carbohydrate consumption 64.56 (SD 4.54), mean fibre 30.05 (SD 17.9), and mean cholesterol 16.85 (SD 17.22). Energy intake was higher in non-vegetarians, and a larger proportion of their energy was derived from proteins and fat; their carbohydrate and cholesterol intakes were also higher. Tamils were mostly vegetarian. BMI values were mainly within the normal range (18.5-23.5), whereas Muslims, who had higher energy intakes, showed BMI above the normal. Conclusion: Two distinct dietary patterns were identified.
Different ethnic groups consume different diets with different nutrient compositions. Dietary pattern is related to overweight, and overweight is related to high blood glucose levels, although some overweight subjects do not show this relation. Keywords: obesity, overweight, diabetes, dietary pattern, nutrition, BMI, non-communicable disease
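The BMI classification used in the study (normal range 18.5-23.5, as stated in the abstract) can be sketched as follows. The overweight/obese cut-off of 27.5 is an assumption based on commonly used Asian-population thresholds, not a figure from the paper.

```python
def bmi(weight_kg, height_m):
    # Body mass index: weight in kilograms divided by height in metres squared.
    return weight_kg / height_m ** 2

def bmi_category(value):
    # Normal range 18.5-23.5 as reported in the study; the 27.5 cut-off
    # separating overweight from obese is an illustrative assumption.
    if value < 18.5:
        return "underweight"
    if value <= 23.5:
        return "normal"
    if value <= 27.5:
        return "overweight"
    return "obese"
```

For example, a 70 kg subject of height 1.75 m has a BMI of about 22.9 and falls in the normal range.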
Procedia PDF Downloads 307
5405 Non-Parametric, Unconditional Quantile Estimation of Efficiency in Microfinance Institutions
Authors: Komlan Sedzro
Abstract:
We apply the non-parametric, unconditional, hyperbolic order-α quantile estimator to appraise the relative efficiency of microfinance institutions in Africa in terms of outreach. Our purpose is to verify whether these institutions, which must constantly strike a compromise between their social role and financial sustainability, are operationally efficient. Using data on African MFIs extracted from the Microfinance Information eXchange (MIX) database and covering the period 2004 to 2006, we find that the more efficient MFIs are also the most profitable. This result is in line with the view that social performance is not in contradiction with the pursuit of excellent financial performance. Our results also show that large MFIs in terms of assets, and those charging the highest fees, are not necessarily the most efficient. Keywords: data envelopment analysis, microfinance institutions, quantile estimation of efficiency, social and financial performance
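The hyperbolic order-α estimator itself is mathematically involved; as a much simpler relative in the same non-parametric family, the following sketches an output-oriented free-disposal-hull (FDH) score for one decision-making unit with a single input and output. This is a deliberately simplified illustration of frontier-based efficiency scoring, not the estimator used in the paper.

```python
def fdh_efficiency(inputs, outputs, unit):
    # Output-oriented FDH score for decision-making unit `unit`:
    # among peers using no more input, find the largest output ratio.
    # A score of 1.0 means no dominating peer exists (the unit is efficient);
    # a score above 1.0 measures how far it sits below the frontier.
    x0, y0 = inputs[unit], outputs[unit]
    best = 1.0
    for x, y in zip(inputs, outputs):
        if x <= x0:  # free disposability: peer uses no more input
            best = max(best, y / y0)
    return best
```

Order-α estimators soften this by benchmarking against a quantile of the peer distribution rather than the full frontier, making the score robust to outliers.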
Procedia PDF Downloads 311
5404 Prevalence and Associated Factors of Protein-Energy Malnutrition Among Children Aged 6-59 Months in Babile Town from April to June 2016
Authors: Tajudin Ahmed
Abstract:
Malnutrition is a significant problem in developing countries, particularly among children, due to inadequate diets, lack of proper care, and unequal distribution of food within households. High rates of malnutrition, including stunting, underweight, and wasting, have been reported in Ethiopia. This study aims to assess the prevalence and associated factors of protein-energy malnutrition (PEM) among children aged 6-59 months in Babile Town. The study utilized a community-based cross-sectional design conducted in Babile Town, Eastern Ethiopia. Two kebeles were randomly selected, and a census was conducted to identify eligible households. A total of 391 households with children aged 6-59 months were included in the study. Data were collected using structured questionnaires, and anthropometric measurements were taken to assess the weight and height of the children. The study found that 72.34% of mothers and 43% of fathers had no formal education. Among the mothers who could read and write, a small percentage had completed primary (14%) or secondary (14%) education, and even fewer had higher education (2.7%). Similarly, among the fathers who could read and write, a majority had completed primary (46.15%) or secondary (27.22%) education, with smaller percentages completing preparatory (8.4%) or higher education (6.29%). The prevalence of malnutrition in the study area was high, with 38.85% of children experiencing stunting (8.2% severely stunted), 50.13% wasting (9% severely wasted), and 41.43% underweight (6.65% severely underweight). These findings indicate a significant burden of malnutrition in Babile Town, likely exacerbated by the high prevalence of infectious diseases such as diarrhea. The study concludes that the prevalence of malnutrition, particularly stunting, wasting, and underweight, is high in Babile Town.
The findings indicate the urgent need for interventions to address malnutrition and improve nutrition and healthcare practices in the study area. These results can serve as a baseline for future studies and inform policymakers and healthcare providers in their efforts to combat childhood malnutrition. Keywords: protein-energy malnutrition, children aged 6-59 months, Babile Town, marasmus
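Anthropometric classification of stunting, wasting, and underweight is conventionally done with WHO-style z-scores against a reference population. The following is a minimal sketch, assuming the standard cut-offs of -2 (moderate) and -3 (severe); the function names are illustrative.

```python
def z_score(observed, ref_median, ref_sd):
    # WHO-style anthropometric z-score, e.g. height-for-age:
    # how many reference standard deviations the child sits from the median.
    return (observed - ref_median) / ref_sd

def classify_stunting(haz):
    # Standard cut-offs: below -3 severe, below -2 moderate stunting.
    if haz < -3:
        return "severe"
    if haz < -2:
        return "moderate"
    return "not stunted"
```

The same cut-offs applied to weight-for-height and weight-for-age z-scores give the wasting and underweight categories reported above.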
Procedia PDF Downloads 58
5403 The Impact of the Number of Neurons in the Hidden Layer on the Performance of MLP Neural Network: Application to the Fast Identification of Toxic Gases
Authors: Slimane Ouhmad, Abdellah Halimi
Abstract:
In this work, we applied an MLP-type neural network to a database from an array of six sensors for the detection of three toxic gases. As the choice of the number of neurons in the hidden layer and of the weight values has a great influence on the convergence of the learning algorithm, we propose, in this article, a mathematical formulation to determine the optimal number of hidden neurons and good weight values based on the method of back-propagation of errors. The results of this modeling improved the discrimination of these gases on the one hand and optimized the computation time on the other, in comparison to other results achieved for this case. Keywords: MLP neural network, back-propagation, number of neurons in the hidden layer, identification, computing time
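The standard empirical alternative to a closed-form rule for the hidden-layer size is a grid search that keeps the size with the lowest validation error. The following is a hedged sketch of that procedure, not the paper's mathematical formulation; `validation_error` stands in for training an MLP of the given size and measuring its error.

```python
def pick_hidden_neurons(candidates, validation_error):
    # Grid search over candidate hidden-layer sizes: train/evaluate each
    # (abstracted here as validation_error) and keep the best one.
    best_n, best_err = None, float("inf")
    for n in candidates:
        err = validation_error(n)
        if err < best_err:
            best_n, best_err = n, err
    return best_n, best_err
```

In practice `validation_error` would train a back-propagation MLP on the six-sensor data and return, say, the held-out misclassification rate.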
Procedia PDF Downloads 349
5402 Predicting Groundwater Areas Using Data Mining Techniques: Groundwater in Jordan as Case Study
Authors: Faisal Aburub, Wael Hadi
Abstract:
Data mining is the process of extracting useful or hidden information from a large database. Extracted information can be used to discover relationships among features, where data objects are grouped according to logical relationships, or to assign unseen objects to one of the predefined groups. In this paper, we aim to investigate four well-known data mining algorithms in order to predict groundwater areas in Jordan. These algorithms are Support Vector Machines (SVMs), Naïve Bayes (NB), K-Nearest Neighbor (kNN), and Classification Based on Association Rule (CBA). The experimental results indicate that the SVMs algorithm outperformed the other algorithms in terms of the classification accuracy, precision, and F1 evaluation measures, using datasets of groundwater areas collected from the Jordanian Ministry of Water and Irrigation. Keywords: classification, data mining, evaluation measures, groundwater
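The evaluation measures named here are defined from the confusion-matrix counts. A minimal sketch, with true positives (tp), false positives (fp), and false negatives (fn) as inputs:

```python
def precision(tp, fp):
    # Of the objects predicted positive, the fraction that really are.
    return tp / (tp + fp)

def recall(tp, fn):
    # Of the truly positive objects, the fraction found by the classifier.
    return tp / (tp + fn)

def f1(tp, fp, fn):
    # Harmonic mean of precision and recall.
    p, r = precision(tp, fp), recall(tp, fn)
    return 2 * p * r / (p + r)
```

Comparing classifiers on the same held-out data by these measures is exactly the kind of comparison reported for SVMs versus NB, kNN, and CBA.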
Procedia PDF Downloads 281
5401 A Service-Learning Experience in the Subject of Adult Nursing
Authors: Eva de Mingo-Fernández, Lourdes Rubio Rico, Carmen Ortega-Segura, Montserrat Querol-García, Raúl González-Jauregui
Abstract:
Today, one of the great challenges that the university faces is to get closer to society and transfer knowledge. The competency-based training approach favours a continuous interaction between practice and theory, which is why it is essential to establish real experiences with reflection and debate and to contrast them with personal and professional knowledge. Service-learning (SL) consists of the integration of academic learning with service in the community, which enables teachers to transfer knowledge with social value and students to be trained on the basis of experience of real needs and problems, with the aim of solving them. SL combines research, teaching, and the transfer of knowledge with social value in response to the real social needs and problems of a community. Goal: The objective of this study was to design, implement, and evaluate a service-learning program in the subject of adult nursing for second-year nursing students. Methodology: After establishing collaboration with eight associations of people with different pathologies, the students were divided into eight groups, and each group was assigned an association. The groups were made up of 10-12 students. The associations willing to participate were for the following conditions: diabetes, multiple sclerosis, cancer, inflammatory bowel disease, fibromyalgia, and heart, lung, and kidney diseases. The methodological design, consisting of five activities, was then applied. Three activities address personal and individual reflections, where the student initially describes what they think it is like to live with a certain disease. They then express the reflections resulting from an interview conducted by peers, in person or online, with a person living with this particular condition, and after sharing the results of their reflections with the rest of the group, they make an oral presentation in which they present their findings to the other students.
This is followed by a service task in which the students collaborate in different activities of the association, and finally, a third individual reflection is carried out in which the students express their experience of the collaboration. The evaluation of this activity is carried out by means of a rubric for both the reflections and the presentation. It should be noted that the oral presentation is evaluated both by the rest of the classmates and by the teachers. Results: The evaluation of the activity given by the students is 7.80/10, with comments that the experience is positive and brings them closer to the reality of the people and the area. Keywords: academic learning integration, knowledge transfer, service-learning, teaching methodology
Procedia PDF Downloads 73
5400 Impact of Financial Performance Indicators on Share Price of Listed Pharmaceutical Companies in India
Authors: Amit Das
Abstract:
Background and significance of the study: Investors and market forecasters generally analyse financial statements when making investment decisions. Mainstream financial accounting and reporting practice recommends a few basic financial performance indicators, namely return on capital employed, return on assets, and earnings per share, which are considerably associated with share prices. This is true of Indian pharmaceutical companies as well. Share investing involves financial risk, so investors look for those financial measures that have a noteworthy impact on share price. A crucial purpose of financial statement analysis and reporting is to offer information that is helpful, predominantly to external users, in making credit and investment decisions. Sound financial performance attracts investors and increases the share price of the respective companies. Keeping this in view, this research work investigates the impact of financial performance indicators on the share price of pharmaceutical companies in India listed on the Bombay Stock Exchange. Methodology: This research work is based on secondary data for the top 101 pharmaceutical companies in India, collected from the moneycontrol database on September 28, 2015. The study purposively selects four financial performance indicators available in the database, that is, earnings per share, return on capital employed, return on assets, and net profit, as independent variables, and share price as the dependent variable, for the 101 pharmaceutical companies. In analysing the data, correlation statistics, multiple regression techniques, and appropriate tests of significance have been used.
Major findings: Correlation statistics show that the four financial performance indicators of the 101 pharmaceutical companies are associated, positively or negatively, with their share prices, and notably, the financial performances of more than 80 companies are related positively. Multiple correlation test results indicate that the financial performance indicators are highly related to the share prices of the selected pharmaceutical companies. Furthermore, multiple regression test results illustrate that when financial performances are good, share prices have increased steadily on the Bombay Stock Exchange, and all results are statistically significant. It is also worth noting that sensitivity indices changed slightly with the financial performance indicators of the selected pharmaceutical companies in India. Concluding statements: The share prices of pharmaceutical companies depend on sound financial performance. It is very clear that share prices move with two important financial performance indicators, namely earnings per share and return on assets. Since the 101 pharmaceutical companies are listed on the Bombay Stock Exchange, and the Sensex moves with them, it is important that the Government of India take decisions regarding the production and export of pharmaceutical products so that the financial performance of the pharmaceutical companies improves and their share prices increase. Keywords: financial performance indicators, share prices, pharmaceutical companies, India
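The correlation statistics reported here rest on the Pearson product-moment coefficient between an indicator series (e.g. earnings per share) and the share-price series. A minimal pure-Python sketch, with the function name chosen for illustration:

```python
def pearson_r(x, y):
    # Pearson product-moment correlation between two equal-length series:
    # covariance divided by the product of the standard deviations.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5
```

A value near +1 corresponds to the positive indicator-to-price relationships reported for most of the companies, and a value near -1 to the negative ones.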
Procedia PDF Downloads 306
5399 Salmonella Emerging Serotypes in Northwestern Italy: Genetic Characterization by Pulsed-Field Gel Electrophoresis
Authors: Clara Tramuta, Floris Irene, Daniela Manila Bianchi, Monica Pitti, Giulia Federica Cazzaniga, Lucia Decastelli
Abstract:
This work presents the results obtained by the Regional Reference Centre for Salmonella Typing (CeRTiS) in a retrospective study aimed at investigating, through Pulsed-Field Gel Electrophoresis (PFGE) analysis, the genetic relatedness of emerging Salmonella serotypes of human origin circulating in the North-West of Italy. A further goal of this work was to create a regional database to facilitate foodborne outbreak investigation and to monitor outbreaks at an earlier stage. A total of 112 strains, isolated from 2016 to 2018 in hospital laboratories, were included in this study. The isolates were previously identified as Salmonella according to standard microbiological techniques, and serotyping was performed according to ISO 6579-3 and the Kauffmann-White scheme using O and H antisera (Statens Serum Institut®). All strains were characterized by PFGE; analysis was conducted according to a standardized PulseNet protocol. The restriction enzyme XbaI was used to generate distinguishable genomic fragments on the agarose gel. PFGE was performed on a CHEF Mapper system, separating large fragments and generating comparable genetic patterns. The agarose gel was then stained with GelRed® and photographed under ultraviolet transillumination. The PFGE patterns obtained from the 112 strains were compared using Bionumerics version 7.6 software with the Dice coefficient, with 2% band tolerance and 2% optimization. For each serotype, the data obtained with PFGE were compared according to the geographical origin and the year in which the strains were isolated. The Salmonella strains were identified as follows: S. Derby, n. 34; S. Infantis, n. 38; S. Napoli, n. 40. All the isolates had appreciable restriction digestion patterns ranging from approximately 40 to 1100 kb. In general, a fairly heterogeneous distribution of pulsotypes emerged across the different provinces. Cluster analysis indicated high genetic similarity (≥ 83%) among the strains of S. Derby (n. 30; 88%), S. Infantis (n. 36; 95%), and S. Napoli (n. 38; 95%) circulating in north-western Italy. The study underlines the genomic similarities shared by the emerging Salmonella strains in Northwest Italy and made it possible to create a database for detecting outbreaks at an early stage. The results therefore confirm that PFGE is a powerful and discriminatory tool for investigating the genetic relationships among strains in order to monitor and control the spread of salmonellosis outbreaks. Pulsed-field gel electrophoresis (PFGE) still represents one of the most suitable approaches to characterize strains, in particular for laboratories for which NGS techniques are not available. Keywords: emerging Salmonella serotypes, genetic characterization, human strains, PFGE
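The Dice coefficient used for pattern comparison has a simple set-based form, D = 2|A ∩ B| / (|A| + |B|), where A and B are the band positions shared between two PFGE profiles. A minimal sketch, ignoring the band-tolerance matching that Bionumerics applies:

```python
def dice_coefficient(bands_a, bands_b):
    # Dice similarity between two PFGE band-position sets:
    # D = 2 * |A intersection B| / (|A| + |B|); 1.0 means identical profiles.
    set_a, set_b = set(bands_a), set(bands_b)
    shared = len(set_a & set_b)
    return 2.0 * shared / (len(set_a) + len(set_b))
```

In practice, two band positions within the configured 2% tolerance would be treated as equal before computing the intersection.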
Procedia PDF Downloads 109
5398 The Effect of LEADER and Community-Led Local Development in Spanish Municipal Unemployment: A Difference-in-Difference Approach
Authors: Miguel A. Borrella, Ana P. Fanjul, Suca Munoz, Liliana Herrera
Abstract:
This paper evaluates the impact of LEADER, a remarkable Community-Led Local Development (CLLD) approach of the European Program for Rural Development, applied to rural municipalities of Spain in 2018 and 2019. Using a difference-in-difference estimation strategy and a newly constructed database, the results show that aided municipalities have significantly lower unemployment levels than non-aided municipalities. The decrease in unemployment is significant both for women and for people younger than 25 years old, two of the target groups of the policy; nevertheless, the effects are larger for male and older workers. The findings therefore suggest that LEADER 2017-2018 was successful in reducing unemployment in rural areas. Keywords: community-led local development, ex-post evaluation, LEADER, rural development
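The canonical 2x2 difference-in-differences estimator behind this strategy compares the change in the treated group with the change in the control group, under a parallel-trends assumption. A minimal sketch with illustrative unemployment figures:

```python
def did_estimate(treated_pre, treated_post, control_pre, control_post):
    # Canonical 2x2 difference-in-differences:
    # (change in treated group) minus (change in control group).
    return (treated_post - treated_pre) - (control_post - control_pre)
```

For example, if aided municipalities fall from 10% to 6% unemployment while non-aided ones fall from 10% to 9%, the estimated treatment effect is a 3-point reduction (the figures here are hypothetical).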
Procedia PDF Downloads 341
5397 A Picture is worth a Billion Bits: Real-Time Image Reconstruction from Dense Binary Pixels
Authors: Tal Remez, Or Litany, Alex Bronstein
Abstract:
The pursuit of smaller pixel sizes at ever-increasing resolutions in digital image sensors is mainly driven by the stringent price and form-factor requirements of sensors and optics in the cellular phone market. Recently, Eric Fossum proposed a novel concept of an image sensor with dense sub-diffraction-limit one-bit pixels (jots), which can be considered a digital emulation of silver halide photographic film. This idea has recently been embodied as the EPFL Gigavision camera. A major bottleneck in the design of such sensors is the image reconstruction process, which produces a continuous high-dynamic-range image from oversampled binary measurements. The extreme quantization of the Poisson statistics is incompatible with the assumptions of most standard image processing and enhancement frameworks. The recently proposed maximum-likelihood (ML) approach addresses this difficulty but suffers from image artifacts and has impractically high computational complexity. In this work, we study a variant of a sensor with binary threshold pixels and propose a reconstruction algorithm combining an ML data-fitting term with a sparse synthesis prior. We also show an efficient, hardware-friendly, real-time approximation of this inverse operator. Promising results are shown on synthetic data as well as on HDR data emulated using multiple exposures of a regular CMOS sensor. Keywords: binary pixels, maximum likelihood, neural networks, sparse coding
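For the simplest case of a unit-threshold jot, the ML inversion from oversampled binary measurements has a closed form: if each one-bit pixel fires whenever at least one photon arrives, then P(jot = 1) = 1 - exp(-λ), so the ML estimate from the fraction of fired jots is λ̂ = -ln(1 - k/n). The following sketch illustrates this single-rate case only; the paper's reconstruction additionally couples such estimates with a sparse synthesis prior.

```python
import math

def ml_photon_rate(ones, total):
    # ML estimate of the Poisson rate per jot, assuming a jot reads 1
    # iff at least one photon arrives during the exposure:
    # P(jot = 1) = 1 - exp(-lam)  =>  lam_hat = -ln(1 - ones / total).
    return -math.log(1.0 - ones / total)
```

Note the estimator diverges as the fired fraction approaches 1, which is one reason dense oversampling (many jots per output pixel) is needed at high light levels.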
Procedia PDF Downloads 204
5396 Food Composition Tables Used as an Instrument to Estimate the Nutrient Ingest in Ecuador
Authors: Ortiz M. Rocío, Rocha G. Karina, Domenech A. Gloria
Abstract:
There are several tools to assess the nutritional status of a population. A main instrument commonly used to build those tools is the food composition table (FCT). Despite the importance of FCTs, there are many error sources and variability factors that can arise in building those tables and can lead to an under- or overestimation of the nutrient intake of a population. This work identified the different food composition tables used as instruments to estimate nutrient intake in Ecuador. The data for choosing FCTs were collected through key informants (self-completed questionnaires), supplemented with institutional web research. A questionnaire with general variables (origin, year of edition, etc.) and methodological variables (method of elaboration, information in the table, etc.) was applied to the identified FCTs. Those variables were defined based on an extensive literature review. A descriptive analysis of content was performed. Ten printed tables and three databases were reported, all of which were treated indistinctly as food composition tables. We managed to get information from 69% of the references; several informants referred to printed documents that were not accessible, and searching the internet was not successful. Of the nine final tables, eight are from Latin America, and five of these were constructed by the indirect method (collection of already published data), having as a main source of information a database from the United States Department of Agriculture (USDA). One FCT was constructed using the direct method (bromatological analysis) and has its origin in Ecuador. All of the tables (100%) made a clear distinction between each food and its method of cooking, 88% of the FCTs expressed values of nutrients per 100 g of edible portion, 77% gave precise additional information about the use of the table, and 55% presented all the macro- and micronutrients in a detailed way. The most complete FCTs were those of INCAP (Central America) and Composition of Foods (Mexico).
The more referred table was: Ecuadorian food composition table of 1965 (70%). The indirect method was used for most tables within this study. However, this method has the disadvantage that it generates less reliable food composition tables because foods show variations in composition. Therefore, a database cannot accurately predict the composition of any isolated sample of a food product.In conclusion, analyzing the pros and cons, and, despite being a FCT elaborated by using an indirect method, it is considered appropriate to work with the FCT of INCAP Central America, given the proximity to our country and a food items list that is very similar to ours. Also, it is imperative to have as a reference the table of composition for Ecuadorian food, which, although is not updated, was constructed using the direct method with Ecuadorian foods. Hence, both tables will be used to elaborate a questionnaire with the purpose of assessing the food consumption of the Ecuadorian population. In case of having disparate values, we will proceed by taking just the INCAP values because this is an updated table.Keywords: Ecuadorian food composition tables, FCT elaborated by direct method, ingest of nutrients of Ecuadorians, Latin America food composition tables
Procedia PDF Downloads 432
5395 Insights into the Perception of Sustainable Technology Adoption among Malaysian Small and Medium-Sized Enterprises
Authors: Majharul Talukder, Ali Quazi
Abstract:
The use of sustainable technology is increasingly driven by the demand to save resources, achieve long-term cost savings, and protect the environment. A transitional economy such as Malaysia is an example where traditional technologies are being replaced by sustainable ones. The antecedents driving Malaysian SMEs to integrate sustainable technology into their business operations have not been well researched. This paper addresses this gap in our knowledge through an examination of attitudes and ethics as antecedents of the acceptance of sustainable technology among Malaysian SMEs. The database comprised 322 responses that were analysed using the PLS-SEM path algorithm. Results indicated that affective and altruistic attitudes have high predictive ability for the usage of sustainable technology in Malaysian SMEs. This paper identifies the implications of the findings along with the major limitations of the research and explores future areas of research in this field.
Keywords: sustainable technology, innovation management, Malaysian SMEs, organizational attitudes and ethical beliefs
Procedia PDF Downloads 334
5394 Texture-Based Image Forensics from Video Frame
Authors: Li Zhou, Yanmei Fang
Abstract:
With current technology, images and videos can be obtained more easily than ever, and once obtained, this digital multimedia information is easy to manipulate, so that the content or source of an image or video can be easily tampered with. In this paper, we propose to identify the image and video frame by a texture-based approach, e.g., Markov Transition Probability (MTP), computed in the spatial domain, DCT domain, and DWT domain, respectively. In the experiment, an image and video frame database is constructed and used to train and test a Support Vector Machine (SVM) classifier. Experimental results show that the texture-based approach performs well. To verify this result and to test the universality and robustness of the algorithm, we built a random testing dataset; the random testing result is consistent with the above experiment.
Keywords: multimedia forensics, video frame, LBP, MTP, SVM
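As an aside, the Markov Transition Probability features named in this abstract amount to a normalized matrix of transitions between neighbouring, truncated coefficient values. The abstract gives no implementation details, so the following is a minimal sketch under assumed conventions (horizontal neighbours only, truncation threshold T; function name and demo data are illustrative):

```python
import numpy as np

def mtp_features(coeffs, T=3):
    """Markov transition probability matrix of horizontally adjacent,
    truncated coefficient values in [-T, T] (illustrative sketch)."""
    c = np.clip(np.asarray(coeffs, dtype=int), -T, T)
    n = 2 * T + 1
    counts = np.zeros((n, n))
    # Count transitions between each pixel/coefficient and its right neighbour.
    for a, b in zip(c[:, :-1].ravel(), c[:, 1:].ravel()):
        counts[a + T, b + T] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1          # avoid division by zero
    return counts / row_sums             # each non-empty row sums to 1

demo = np.array([[0, 1, 1, 0],
                 [2, 2, 2, 2]])
P = mtp_features(demo, T=3)
print(P[3 + 0, 3 + 1])  # estimated P(0 -> 1)
```

In the paper's setting, the same counting would be run over DCT and DWT coefficient arrays as well, and the flattened matrices fed to the SVM.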
Procedia PDF Downloads 428
5393 Importance of Different Spatial Parameters in Water Quality Analysis within Intensive Agricultural Area
Authors: Marina Bubalo, Davor Romić, Stjepan Husnjak, Helena Bakić
Abstract:
Even though European Council Directive 91/676/EEC, known as the Nitrates Directive, was adopted in 1991, the issue of preserving water quality in areas of intensive agricultural production still persists all over Europe. High nitrate nitrogen concentrations in surface water and groundwater originating from diffuse sources are among the most important environmental problems of modern intensive agriculture. The fate of nitrogen in soil, surface water, and groundwater in agricultural areas is mostly affected by anthropogenic activity (i.e., agricultural practice) and by hydrological and climatological conditions. The aim of this study was to identify the impact of land use, soil type, soil vulnerability to pollutant percolation, and natural aquifer vulnerability on nitrate occurrence in surface water and groundwater within an intensive agricultural area. The study was set in Varaždin County (northern Croatia), which is under the significant influence of the large rivers Drava and Mura; consequently, the entire area is dominated by alluvial soil with a shallow active profile, mainly on a gravel base. The negative agricultural impact on water quality in this area is evident; therefore, half of the selected county is part of the delineated nitrate vulnerable zones (NVZ). Data on water quality were collected from 7 surface water and 8 groundwater monitoring stations in the county. A recent study of the area also included a detailed inventory of agricultural production and fertilizer use, with the aim of producing a new agricultural land use database as one of the dominant parameters. The analysis of this database, done using ArcGIS 10.1, showed that 52.7% of the total county area is agricultural land and that 59.2% of the agricultural land is used for intensive agricultural production. On the other hand, 56% of the soil within the county is classified as vulnerable to pollutant percolation. The situation is similar for natural aquifer vulnerability: the northern part of the county ranges from high to very high aquifer vulnerability.
Statistical analysis of the water quality data was done using SPSS 13.0. Cluster analysis grouped both surface water and groundwater stations into two groups according to nitrate nitrogen concentrations. The mean nitrate nitrogen concentration in surface water ranges from 4.2 to 5.5 mg/l in group 1 and from 24 to 42 mg/l in group 2. The results are similar, but evidently higher, in the groundwater samples: the mean nitrate nitrogen concentration ranges from 3.9 to 17 mg/l in group 1 and from 36 to 96 mg/l in group 2. ANOVA confirmed the statistical significance between stations classified in the same group. The previously listed parameters (land use, soil type, etc.) were used in a factorial correspondence analysis (FCA) to detect the importance of each parameter for local water quality. Since these parameters mostly cannot be altered, there is an obvious need for more precise and better-adapted land management under such conditions.
Keywords: agricultural area, nitrate, factorial correspondence analysis, water quality
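The two-group split of monitoring stations by mean nitrate concentration can be illustrated with a tiny one-dimensional k-means. The study used SPSS cluster analysis, so this is only a sketch of the idea; the station values below are hypothetical numbers chosen inside the ranges reported in the abstract:

```python
import numpy as np

def kmeans_1d_two_groups(values, iters=20):
    """Tiny 1-D k-means (k=2): split stations into low/high nitrate groups.
    Illustrative only; the study itself used SPSS cluster analysis."""
    x = np.asarray(values, dtype=float)
    lo, hi = x.min(), x.max()            # initialise centroids at the extremes
    for _ in range(iters):
        labels = (np.abs(x - hi) < np.abs(x - lo)).astype(int)
        lo = x[labels == 0].mean()
        hi = x[labels == 1].mean()
    return labels, (lo, hi)

# Hypothetical mean nitrate-N concentrations (mg/l) at six surface stations,
# within the reported ranges (group 1: 4.2-5.5, group 2: 24-42).
stations = [4.2, 5.0, 5.5, 24.0, 33.0, 42.0]
labels, centroids = kmeans_1d_two_groups(stations)
print(labels.tolist(), centroids)  # [0, 0, 0, 1, 1, 1] with centroids 4.9, 33.0
```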
Procedia PDF Downloads 259
5392 Distinct Patterns of Resilience Identified Using Smartphone Mobile Experience Sampling Method (M-ESM) and a Dual Model of Mental Health
Authors: Hussain-Abdulah Arjmand, Nikki S. Rickard
Abstract:
The response to stress can be highly heterogeneous and may be influenced by methodological factors. The integrity of the data is optimized by measuring both positive and negative affective responses to an event, by measuring responses in real time as close to the stressful event as possible, and by utilizing data collection methods that do not interfere with naturalistic behaviours. The aim of the current study was to explore short-term prototypical responses to major stressor events on outcome measures encompassing both positive and negative indicators of psychological functioning. A novel mobile experience sampling methodology (m-ESM) was utilized to monitor affective responses to stressors in real time. A smartphone mental health app ('Moodprism'), which prompts users daily to report both their positive and negative mood, as well as whether any significant event has occurred in the past 24 hours, was developed for this purpose. A sample of 142 participants was recruited as part of the promotion of this app. Participants' daily reported experience of stressor events, levels of depressive symptoms, and positive affect were collected across a 30-day period as they used the app. For each participant, major stressor events were identified based on the subjective severity of the event as rated by the user. Depression and positive affect ratings were extracted for the three days following the event. Responses to the event were scaled relative to the participant's general reactivity across the remainder of the 30-day period. Participants were first clustered into groups based on initial reactivity and subsequent recovery following a stressor event. This revealed distinct patterns of responding along depressive symptomatology and positive affect. Participants were then grouped based on their allocations to clusters in each outcome variable. A highly individualised manner in which participants respond to stressor events, in symptoms of depression and levels of positive affect, was observed.
A complete description of the novel profiles identified will be presented at the conference. These findings suggest that real-time measurement of both positive and negative functioning in response to stressors yields a more complex set of responses than previously observed with retrospective reporting. The use of smartphone technology to measure individualized responding also proved to shed significant insight.
Keywords: depression, experience sampling methodology, positive functioning, resilience
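The step of scaling a participant's post-event responses relative to their general reactivity over the remainder of the 30-day period can be sketched as a simple standardization. The abstract does not specify the exact scaling formula, so z-scoring against the baseline mean and standard deviation is an assumption here, and the daily ratings below are hypothetical:

```python
import statistics

def scaled_response(post_event, baseline):
    """Scale post-event ratings relative to the participant's general
    reactivity (mean and SD of the rest of the monitoring period).
    A sketch of the scaling idea; the study's exact procedure may differ."""
    mu = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    return [(x - mu) / sd for x in post_event]

# Hypothetical daily depression ratings (0-10): 27 baseline days and the
# 3 days following a major stressor event.
baseline = [3, 2, 4, 3, 3, 2, 4, 3, 2, 3, 4, 3, 3, 2,
            4, 3, 3, 2, 4, 3, 2, 3, 4, 3, 3, 2, 4]
post_event = [8, 6, 4]
print(scaled_response(post_event, baseline))  # spike, then recovery toward baseline
```

Clustering these scaled trajectories (initial reactivity, then recovery) is what separates the resilience profiles the study reports.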
Procedia PDF Downloads 240
5391 Enhancing Financial Security: Real-Time Anomaly Detection in Financial Transactions Using Machine Learning
Authors: Ali Kazemi
Abstract:
The digital evolution of financial services, while offering unprecedented convenience and accessibility, has also escalated vulnerability to fraudulent activities. In this study, we introduce a distinct approach to real-time anomaly detection in financial transactions, aiming to fortify the defenses of banking and financial institutions against such threats. Utilizing unsupervised machine learning algorithms, specifically autoencoders and isolation forests, our research focuses on identifying irregular patterns indicative of fraud within transactional data, thus enabling immediate action to prevent financial loss. The data used in this study included the monetary value of each transaction, a crucial feature, as fraudulent transactions may show different amount distributions than legitimate ones; timestamps indicating when transactions occurred, since analyzing transactions' temporal patterns can reveal anomalies (e.g., unusual activity in the middle of the night); the sector or category of the merchant where the transaction occurred, such as retail, groceries, or online services, since specific categories may be more prone to fraud; and the type of payment used (e.g., credit, debit, online payment systems), since different payment methods carry varying levels of fraud risk. This dataset, anonymized to ensure privacy, reflects a wide array of transactions typical of a global banking institution, ranging from small-scale retail purchases to large wire transfers, embodying the diverse nature of potentially fraudulent activities. By engineering features that capture the essence of transactions, including normalized amounts and encoded categorical variables, we tailor our data to enhance model sensitivity to anomalies.
The autoencoder model leverages its reconstruction-error mechanism to flag transactions that deviate significantly from the learned normal pattern, while the isolation forest identifies anomalies based on their susceptibility to isolation from the majority of the dataset. Our experimental results, validated through techniques such as k-fold cross-validation, are evaluated using precision, recall, and the F1 score alongside the area under the receiver operating characteristic (ROC) curve. Our models achieved an F1 score of 0.85 and a ROC AUC of 0.93, indicating high accuracy in detecting fraudulent transactions without excessive false positives. This study contributes to the academic discourse on financial fraud detection and provides a practical framework for banking institutions seeking to implement real-time anomaly detection systems. By demonstrating the effectiveness of unsupervised learning techniques in a real-world context, our research offers a pathway to significantly reducing the incidence of financial fraud, thereby enhancing the security and trustworthiness of digital financial services.
Keywords: anomaly detection, financial fraud, machine learning, autoencoders, isolation forest, transactional data analysis
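The reconstruction-error idea can be illustrated without a neural network: a linear autoencoder is equivalent to PCA, so projecting onto the top principal components and measuring how poorly each row reconstructs is a minimal stand-in for the autoencoder's test. This sketch is not the paper's model, and the two-feature dataset is synthetic:

```python
import numpy as np

def reconstruction_errors(X, k=1):
    """Reconstruction error after projecting onto the top-k principal
    components. A linear autoencoder is equivalent to PCA, so rows that
    reconstruct poorly are candidate anomalies."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:k]                     # "encoder" weights: k x d
    X_hat = (Xc @ W.T) @ W         # encode, then decode
    return np.linalg.norm(Xc - X_hat, axis=1)

rng = np.random.default_rng(0)
# Hypothetical 2-D features (e.g. scaled amount vs. account turnover) in
# which legitimate transactions are strongly correlated.
t = rng.normal(0.0, 1.0, size=(200, 1))
normal = np.hstack([t, t]) + rng.normal(0.0, 0.05, size=(200, 2))
fraud = np.array([[2.0, -2.0]])    # breaks the learned correlation
X = np.vstack([normal, fraud])
errors = reconstruction_errors(X, k=1)
print(int(np.argmax(errors)))      # row 200: the injected fraud
```

A real deployment would train an autoencoder (and an isolation forest) on many engineered features and set the flagging threshold from a validation set, but the decision rule, "large reconstruction error means anomaly", is the same.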
Procedia PDF Downloads 59
5390 Detection and Distribution Pattern of Prevalent Genotypes of Hepatitis C in a Tertiary Care Hospital of Western India
Authors: Upasana Bhumbla
Abstract:
Background: The hepatitis C virus is a major cause of chronic hepatitis, which can further lead to cirrhosis of the liver and hepatocellular carcinoma. Worldwide, the burden of hepatitis C infection has become a serious threat to the human race. The hepatitis C virus (HCV) has population-specific genotypes, which provide valuable epidemiological and therapeutic information. Genotyping and assessment of viral load in HCV patients are important for planning therapeutic strategies. The aim is to study the changing trends in the prevalence and genotypic distribution of the hepatitis C virus in a tertiary care hospital in Western India. Methods: In this retrospective study, blood samples were collected and tested for anti-HCV antibodies by ELISA in the Department of Microbiology. In seropositive hepatitis C patients, quantification of HCV-RNA was done by real-time PCR, and in HCV-RNA-positive samples, genotyping was conducted. Results: A total of 114 patients seropositive for anti-HCV were recruited in the study, of whom 79 (69.29%) were HCV-RNA positive. Of these positive samples, 54 were further subjected to genotype determination using real-time PCR. The genotype could not be detected in 24 samples due to low viral load; 30 samples were positive for a genotype. Conclusion: Knowledge of the genotype is crucial for the management of HCV infection and the prediction of prognosis. Patients infected with HCV genotypes 1 and 4 have to receive interferon and ribavirin for 48 weeks; patients with these genotypes show a poor sustained viral response when tested 24 weeks after completion of therapy. On the contrary, patients infected with HCV genotypes 2 and 3 are reported to have a better response to therapy.
Keywords: hepatocellular, genotype, ribavirin, seropositive
Procedia PDF Downloads 127
5389 The Relationship between Political Risks and Capital Adequacy Ratio: Evidence from GCC Countries Using a Dynamic Panel Data Model (System–GMM)
Authors: Wesam Hamed
Abstract:
This paper contributes to the existing literature by investigating the impact of political risks on the capital adequacy ratio in the banking sector of the Gulf Cooperation Council (GCC) countries, which, to the best of our knowledge, is the first attempt to examine this nexus. The dynamic panel data model (System-GMM) showed that political risks significantly decrease the capital adequacy ratio in the banking sector. For this purpose, we used political risk, bank-specific, profitability, and macroeconomic variables drawn from the Datastream database for the period 2005-2017. The results also actively support the "too big to fail" hypothesis. Finally, the robustness results confirm the conclusions derived from the baseline System-GMM model.
Keywords: capital adequacy ratio, system GMM, GCC, political risks
Procedia PDF Downloads 148
5388 A Conglomerate of Multiple Optical Character Recognition Table Detection and Extraction
Authors: Smita Pallavi, Raj Ratn Pranesh, Sumit Kumar
Abstract:
Representing information as tables is a compact and concise method that eases searching, indexing, and storage requirements. Extracting and cloning tables from parsable documents is easier and widely used; however, industry still faces challenges in detecting and extracting tables from OCR (Optical Character Recognition) documents or images. This paper proposes an algorithm that detects and extracts multiple tables from an OCR document. The algorithm uses a combination of image processing techniques, text recognition, and procedural coding to identify distinct tables in the same image and to map the text to the corresponding cell in a dataframe, which can be stored as comma-separated values, a database, an Excel file, and multiple other usable formats.
Keywords: table extraction, optical character recognition, image processing, text extraction, morphological transformation
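One of the morphological transformations alluded to in the keywords is commonly a morphological opening with a long, thin structuring element, which isolates table rulings from text strokes. The paper's pipeline is not published here, so the following is a generic sketch of that one step in pure NumPy (erosion, then dilation, along each row), on a toy binarized image:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def horizontal_lines(binary, min_len):
    """Morphological opening with a 1 x min_len structuring element:
    keeps only horizontal runs of foreground pixels at least min_len long,
    which is how table rulings are typically separated from text."""
    h, w = binary.shape
    # Erosion: a pixel survives only if a full min_len window around it is 1.
    eroded = sliding_window_view(binary, min_len, axis=1).min(axis=-1)
    # Dilation: grow surviving runs back to their original extent.
    padded = np.zeros((h, w + min_len - 1), dtype=binary.dtype)
    padded[:, min_len - 1 : w] = eroded
    opened = sliding_window_view(padded, min_len, axis=1).max(axis=-1)
    return opened

# Toy binarized image: row 1 holds a long ruling, row 3 short "text" strokes.
img = np.zeros((5, 12), dtype=np.uint8)
img[1, 1:11] = 1          # 10-pixel horizontal line
img[3, 2:5] = 1           # 3-pixel text run
lines = horizontal_lines(img, min_len=6)
print(int(lines[1].sum()), int(lines[3].sum()))  # 10 0: ruling kept, text removed
```

Running the same opening with a vertical element and intersecting the two masks gives candidate cell borders, from which cell text can be mapped into a dataframe.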
Procedia PDF Downloads 145
5387 CyberSteer: Cyber-Human Approach for Safely Shaping Autonomous Robotic Behavior to Comply with Human Intention
Authors: Vinicius G. Goecks, Gregory M. Gremillion, William D. Nothwang
Abstract:
Modern approaches to training intelligent agents rely on prolonged training sessions, large amounts of input data, and multiple interactions with the environment. This restricts the application of these learning algorithms in robotics and real-world applications, where there is low tolerance for inadequate actions, interactions are expensive, and real-time processing and action are required. This paper addresses this issue by introducing CyberSteer, a novel approach to efficiently designing intrinsic reward functions based on human intention to guide deep reinforcement learning agents with no environment-dependent rewards. CyberSteer uses non-expert human operators for the initial demonstration of a given task or desired behavior. The trajectories collected are used to train a behavior cloning deep neural network that asynchronously runs in the background and suggests actions to the deep reinforcement learning module. An intrinsic reward is computed based on the similarity between the actions suggested and those taken by the deep reinforcement learning algorithm commanding the agent. This intrinsic reward can also be reshaped through additional human demonstration or critique. This approach removes the need for environment-dependent or hand-engineered rewards while still being able to safely shape the behavior of autonomous robotic agents, in this case, based on human intention. CyberSteer is tested in a high-fidelity unmanned aerial vehicle simulation environment, the Microsoft AirSim. The simulated aerial robot performs collision avoidance through a cluttered forest environment using forward-looking depth sensing and roll, pitch, and yaw reference angle commands to the flight controller. This approach shows that the behavior of robotic systems can be shaped in a reduced amount of time when guided by a non-expert human who is only aware of the high-level goals of the task.
Decreasing the amount of training time required and increasing safety during training maneuvers will allow for faster deployment of intelligent robotic agents in dynamic real-world applications.
Keywords: human-robot interaction, intelligent robots, robot learning, semi-supervised learning, unmanned aerial vehicles
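The core mechanism, an intrinsic reward computed from the similarity between the behavior-cloning suggestion and the action the RL agent actually took, can be sketched simply. The abstract does not give the exact reward formula, so the exponential-of-distance form below is one plausible choice, not the paper's; the command tuples are hypothetical:

```python
import math

def intrinsic_reward(action_taken, action_suggested, scale=1.0):
    """One plausible similarity-based intrinsic reward (the exact formula is
    not published in this abstract): reward decays exponentially with the
    Euclidean distance between the RL action and the behavior-cloning
    suggestion, so matching the human-derived suggestion pays most."""
    dist = math.sqrt(sum((a - b) ** 2
                         for a, b in zip(action_taken, action_suggested)))
    return math.exp(-scale * dist)

# Hypothetical (roll, pitch, yaw) reference commands, normalized to [-1, 1].
suggested = (0.10, -0.20, 0.00)
print(intrinsic_reward(suggested, suggested))             # exact match: 1.0
print(intrinsic_reward((0.9, 0.8, -0.7), suggested) < 1)  # mismatch pays less
```

Because the reward depends only on the two action vectors, it needs no environment-specific signal, which is the point the abstract makes.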
Procedia PDF Downloads 259
5386 Hybrid Data-Driven Drilling Rate of Penetration Optimization Scheme Guided by Geological Formation and Historical Data
Authors: Ammar Alali, Mahmoud Abughaban, William Contreras Otalvora
Abstract:
Optimizing the drilling process for cost and efficiency requires optimization of the rate of penetration (ROP). ROP is the measurement of the speed at which the wellbore is created, in units of feet per hour, and is the primary indicator of drilling efficiency. Maximizing the ROP can indicate fast and cost-efficient drilling operations; however, high ROPs may induce unintended events, which may lead to nonproductive time (NPT) and higher net costs. The proposed ROP optimization solution is a hybrid, data-driven system that aims to improve the drilling process, maximize the ROP, and minimize NPT. The system consists of two phases: (1) utilizing existing geological and drilling data to train the model beforehand, and (2) making real-time adjustments to the controllable dynamic drilling parameters [weight on bit (WOB), rotary speed (RPM), and pump flow rate (GPM)] that directly influence the ROP. During the first phase of the system, geological and historical drilling data are aggregated. Then, the top-rated wells, in terms of high ROP, are distinguished. Those wells are filtered based on NPT incidents, and a cross-plot is generated for the controllable dynamic drilling parameters per ROP value. Subsequently, the parameter values (WOB, GPM, RPM) are calculated as a conditioned mean based on physical distance, following the Inverse Distance Weighting (IDW) interpolation methodology. The first phase is concluded by producing a model of drilling best practices from the offset wells, prioritizing the optimum ROP value. This phase is performed before drilling commences. Starting with the model produced in phase one, the second phase runs an automated drill-off test, delivering live adjustments in real time. Those adjustments are made by directing the driller to deviate two of the controllable parameters (WOB and RPM) by a small percentage (0-5%), following the Constrained Random Search (CRS) methodology.
These minor incremental variations reveal new drilling conditions not explored before through offset wells. The data are then consolidated into a heat map as a function of ROP. A more optimal ROP performance is identified through the heat map and amended in the model. The validation process involved the selection of a planned well in an onshore oil field with hundreds of offset wells. The first-phase model was built by utilizing the data points from the top-performing historical wells (20 wells). The model allows drillers to enhance decision-making by leveraging existing data and blending it with live data in real time. An empirical relationship between the controllable dynamic parameters and ROP was derived using Artificial Neural Networks (ANN). The adjustments improved ROP efficiency by over 20%, translating to at least 10% savings in drilling costs. The novelty of the proposed system lies in its ability to integrate historical data, calibrate based on geological formations, and run real-time global optimization through CRS. Those factors position the system to work for any newly drilled well in a developing field.
Keywords: drilling optimization, geological formations, machine learning, rate of penetration
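The Inverse Distance Weighting interpolation named in phase one is a standard formula: each known observation contributes in proportion to the inverse of its distance (raised to a power) from the target point. A minimal one-dimensional sketch, with hypothetical ROP/WOB pairs standing in for offset-well data:

```python
def idw(target, known_points, power=2):
    """Inverse Distance Weighting: estimate a drilling parameter (e.g. WOB)
    at a target ROP from offset-well observations. A sketch of the
    conditioned mean described in phase one; the data are illustrative."""
    num = den = 0.0
    for x, value in known_points:
        d = abs(target - x)
        if d == 0:
            return value            # exact match: use the observation as-is
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

# Hypothetical (ROP ft/hr, WOB klb) pairs from top-performing offset wells.
offsets = [(80.0, 20.0), (100.0, 25.0), (120.0, 30.0)]
print(idw(100.0, offsets))           # hits an observation exactly: 25.0
print(round(idw(110.0, offsets), 2)) # weighted toward the two nearest wells
```

In the paper's setting the distance would be physical distance between wells rather than a single scalar, but the weighting scheme is the same.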
Procedia PDF Downloads 133
5385 Evaluation of Real-Time Background Subtraction Technique for Moving Object Detection Using Fast-Independent Component Analysis
Authors: Naoum Abderrahmane, Boumehed Meriem, Alshaqaqi Belal
Abstract:
Background subtraction is a widely used technique for detecting moving objects in video surveillance by extracting the foreground objects from a reference background image. A good background subtraction algorithm must cope with many challenges, such as changes in illumination, dynamic backgrounds (e.g., swinging leaves, rain, and snow), and changes in the background itself, for example, the moving and stopping of vehicles. In this paper, we propose an efficient and accurate background subtraction method for moving object detection in video surveillance. The main idea is to use a developed fast independent component analysis (fast-ICA) algorithm to separate the background, noise, and foreground masks from an image sequence in practical environments. The fast-ICA algorithm is adapted and adjusted with a matrix calculation and a search for an optimal non-quadratic function to be faster and more robust. Moreover, in order to estimate the de-mixing matrix and the denoising de-mixing matrix parameters, we propose converting all images to the YCrCb color space, where the luma component Y (the brightness of the color) gives suitable results. The proposed technique has been verified on the publicly available datasets CDnet 2012 and CDnet 2014, and experimental results show that our algorithm can detect moving objects competently and accurately in challenging conditions, at a real-time frame rate, compared to other methods in the literature in terms of quantitative and qualitative evaluations.
Keywords: background subtraction, moving object detection, fast-ICA, de-mixing matrix
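The paper's contribution is the fast-ICA separation step, but the underlying background-subtraction operation it improves on is simple to show: model the background per pixel from recent frames and threshold the difference with the current frame. This sketch uses a median background model, not the paper's ICA method, and the toy frames are synthetic:

```python
import numpy as np

def foreground_mask(frames, current, thresh=30):
    """Classic background-subtraction baseline: model the background as the
    per-pixel median of recent frames, then threshold the absolute
    difference. (The paper itself separates background, noise, and
    foreground with a fast-ICA variant; this shows only the generic
    subtraction step it builds on.)"""
    background = np.median(np.stack(frames), axis=0)
    return np.abs(current.astype(float) - background) > thresh

# Toy 4x4 grayscale sequence: static background of value 100, then an
# "object" (value 200) enters the top-left 2x2 block of the current frame.
history = [np.full((4, 4), 100, dtype=np.uint8) for _ in range(5)]
current = np.full((4, 4), 100, dtype=np.uint8)
current[:2, :2] = 200
mask = foreground_mask(history, current)
print(int(mask.sum()))   # 4 foreground pixels detected
```

The median model already tolerates brief occlusions; the dynamic-background and illumination challenges listed in the abstract are exactly where such a baseline fails and a learned separation like fast-ICA helps.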
Procedia PDF Downloads 98
5384 Recognition of Cursive Arabic Handwritten Text Using Embedded Training Based on Hidden Markov Models (HMMs)
Authors: Rabi Mouhcine, Amrouch Mustapha, Mahani Zouhir, Mammass Driss
Abstract:
In this paper, we present a system for the offline recognition of cursive Arabic handwritten text based on Hidden Markov Models (HMMs). The system is analytical, works without explicit segmentation, and uses embedded training to build and enhance the character models. Feature extraction, preceded by baseline estimation, uses statistical and geometric features to integrate both the peculiarities of the text and the pixel distribution characteristics of the word image. These features are modelled using hidden Markov models and trained by embedded training. Experiments on images from the benchmark IFN/ENIT database show that the proposed system improves recognition.
Keywords: recognition, handwriting, Arabic text, HMMs, embedded training
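At decoding time, an HMM recognizer recovers the most likely hidden state sequence for a sequence of observed feature frames with the Viterbi algorithm. The tiny two-state model below (state and observation names are invented for illustration; a real recognizer concatenates per-character HMMs into word models via embedded training) shows the standard dynamic program:

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Standard Viterbi decoding: the most likely hidden state sequence
    for an observation sequence under a discrete HMM."""
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    path = {s: [s] for s in states}
    for o in obs[1:]:
        V.append({})
        new_path = {}
        for s in states:
            # Best predecessor for state s at this time step.
            prob, prev = max((V[-2][p] * trans_p[p][s] * emit_p[s][o], p)
                             for p in states)
            V[-1][s] = prob
            new_path[s] = path[prev] + [s]
        path = new_path
    best = max(states, key=lambda s: V[-1][s])
    return path[best]

states = ("stroke", "gap")                 # hypothetical sub-character states
start_p = {"stroke": 0.6, "gap": 0.4}
trans_p = {"stroke": {"stroke": 0.7, "gap": 0.3},
           "gap": {"stroke": 0.4, "gap": 0.6}}
emit_p = {"stroke": {"ink": 0.9, "blank": 0.1},
          "gap": {"ink": 0.2, "blank": 0.8}}
print(viterbi(["ink", "ink", "blank"], states, start_p, trans_p, emit_p))
```

Production systems work in log-probabilities to avoid underflow and emit continuous features, but the recursion is the same.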
Procedia PDF Downloads 355
5383 The Use of Ultrasound as a Safe and Cost-Efficient Technique to Assess Visceral Fat in Children with Obesity
Authors: Bassma A. Abdel Haleem, Ehab K. Emam, George E. Yacoub, Ashraf M. Salem
Abstract:
Background: Obesity is an increasingly common problem in childhood. Childhood obesity is considered the main risk factor for the development of metabolic syndrome (MetS), i.e., type 2 diabetes, dyslipidemia, and hypertension. Recent studies estimate that 30-60% of children with obesity will develop MetS. Visceral fat thickness is a valuable predictor of the development of MetS. Computed tomography and dual-energy X-ray absorptiometry are the main techniques for assessing visceral fat; however, they carry the risk of radiation exposure and are expensive procedures. Consequently, they are seldom used in the assessment of visceral fat in children. Some studies have explored the potential of ultrasound as a substitute for assessing visceral fat in the elderly and found promising results. Given the vulnerability of children to radiation exposure, we sought to evaluate ultrasound as a safer and more cost-efficient alternative for measuring visceral fat in obese children. Additionally, we assessed the correlation between visceral fat and obesity indicators such as insulin resistance. Methods: A cross-sectional study was conducted on 46 children with obesity (aged 6-16 years). Their visceral fat was evaluated by ultrasound. Subcutaneous fat thickness (SFT), i.e., the measurement from the skin-fat interface to the linea alba, and visceral fat thickness (VFT), i.e., the thickness from the linea alba to the aorta, were measured and correlated with anthropometric measures, fasting lipid profile, the homeostatic model assessment for insulin resistance (HOMA-IR), and liver enzymes (ALT). Results: VFT assessed via ultrasound was found to correlate strongly with BMI and HOMA-IR, with an AUC of 0.858 for VFT as a predictor of insulin resistance and a cut-off point of >2.98. VFT also correlates positively with serum triglycerides and serum ALT, and negatively with HDL.
Conclusions: Ultrasound, a safe and cost-efficient technique, could be a useful tool for measuring abdominal fat thickness in children with obesity. Ultrasound-measured VFT could be an appropriate prognostic factor for insulin resistance, hypertriglyceridemia, and elevated liver enzymes in obese children.
Keywords: metabolic syndrome, pediatric obesity, sonography, visceral fat
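The AUC of 0.858 reported for VFT as a predictor of insulin resistance is the area under the ROC curve, which equals the Mann-Whitney probability that a randomly chosen positive case scores higher than a randomly chosen negative case. A minimal sketch on hypothetical VFT values (not the study's data):

```python
def auc(scores_pos, scores_neg):
    """AUC as the Mann-Whitney statistic: the probability that a randomly
    chosen positive (here, insulin-resistant) case has a higher VFT than a
    randomly chosen negative case; ties count half."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical ultrasound VFT values (cm) for children with and without
# insulin resistance (HOMA-IR above vs. below threshold).
vft_ir = [3.1, 3.5, 4.0, 2.9, 3.8]      # insulin-resistant
vft_no = [2.1, 2.5, 3.0, 1.9, 3.2]      # not insulin-resistant
print(auc(vft_ir, vft_no))              # 0.88 for this toy split
```

The cut-off of >2.98 cm reported in the abstract would be the VFT threshold on that curve balancing sensitivity and specificity.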
Procedia PDF Downloads 121
5382 The Research on Diesel Bus Emissions in Ulaanbaatar City: Mongolia
Authors: Tsetsegmaa A., Bayarsuren B., Altantsetseg Ts.
Abstract:
To make the best decisions on reducing harmful emissions from buses, we need a clear understanding of the current state of their actual emissions. The emissions from city buses running on high-sulfur fuel, particularly particulate matter (PM) and nitrogen oxides (NOx) in the exhaust gases of conventional diesel engines, have been studied and measured, with and without a diesel particulate filter (DPF), in Ulaanbaatar city. The study was conducted using a PEMS (Portable Emissions Measurement System) and the gravimetric method in real traffic conditions. The obtained data were used to determine the actual emission rates and to evaluate the effectiveness of the selected particulate filters. Actual road and daily PM emissions from city buses were determined during the warm and cold seasons. A bus with an average daily mileage of 242 km was found to emit on average 166.155 g of PM per day into the city's atmosphere, with 141.3 g in summer and 175.8 g in winter. The actual PM emission of a city bus is 0.6866 g/km. The concentration of NOx in the exhaust gas averages 1410.94 ppm. The use of DPFs reduced the exhaust gas opacity of 24 buses by an average of 97% and filtered a total of 340.4 kg of soot from these buses over a period of six months. Retrofitting an old conventional diesel engine with a cassette-type silicon carbide (SiC) DPF, despite the laboriousness of cleaning, can significantly reduce particulate matter emissions. Innovation: the first comprehensive road PM and NOx emission dataset and the actual road emissions from public buses have been identified, and PM and NOx mathematical model equations have been estimated as functions of the bus technical speed and engine revolutions, with and without DPF.
Keywords: conventional diesel, silicon carbide, real-time onboard measurements, particulate matter, diesel retrofit, fuel sulphur
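The per-kilometre figure in this abstract follows directly from the daily totals, which makes for a quick consistency check of the reported numbers:

```python
# Cross-checking the reported figures: 166.155 g of PM per day over an
# average daily mileage of 242 km gives the stated per-kilometre rate.
daily_pm_g = 166.155
daily_km = 242
per_km = daily_pm_g / daily_km
print(round(per_km, 4))          # 0.6866 g/km, as reported

# The seasonal figures should bracket the overall daily mean.
summer_g, winter_g = 141.3, 175.8
assert summer_g < daily_pm_g < winter_g
```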
Procedia PDF Downloads 166
5381 Phylogenetic Analysis of the Myxosporea Detected from Emaciated Olive Flounder (Paralichthys olivaceus) in Korea
Authors: Seung Min Kim, Lyu Jin Jun, Joon Bum Jeong
Abstract:
The myxosporean that causes emaciation disease in olive flounder (Paralichthys olivaceus) is a pathogen responsible for severe losses in the aquafarming industry in Korea. The 3,362-bp DNA nucleotide sequences of four myxosporean strains (EM-HM-12, EM-MA-13, EM-JJ-14, and EM-MS-15), detected by PCR from olive flounder suffering from emaciation disease in Korea during 2012-2015, were sequenced and deposited in the GenBank database (GenBank accession numbers: KU377574, KT321705, KU377575, and KU377573, respectively). The nucleotide sequences of the four strains were compared to each other and were more than 99.7% homologous. All of the strains were identified as Parvicapsula petunia based on the results of the phylogenetic analysis. The results of this study should be useful for research on emaciation disease in olive flounder in Korea.
Keywords: disease, emaciation, olive flounder, phylogenetic analysis
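The pairwise homology comparison underlying the ">99.7% homologous" figure is, at its simplest, per-site identity between aligned sequences. Real analyses align the 3,362-bp sequences first; the sketch below assumes already-aligned, equal-length fragments, and the toy sequences are hypothetical, not the GenBank entries:

```python
def percent_identity(seq_a, seq_b):
    """Per-site identity of two aligned, equal-length nucleotide sequences.
    A simplified stand-in for the alignment-based homology comparison."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    return 100.0 * matches / len(seq_a)

# Toy aligned fragments with a single substitution.
s1 = "ATGCGTACGTTAGCAT"
s2 = "ATGCGTACGTTAGCGT"
print(round(percent_identity(s1, s2), 2))  # 93.75 for this 16-bp toy pair
```

For the study's strains, the same comparison over 3,362 aligned positions with at most a handful of mismatches yields the reported >99.7% identity.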
Procedia PDF Downloads 299