Search results for: statistical techniques
8920 Evaluation of Three Digital Graphical Methods of Baseflow Separation Techniques in the Tekeze Water Basin in Ethiopia
Authors: Alebachew Halefom, Navsal Kumar, Arunava Poddar
Abstract:
The purpose of this work is to specify parameter values, estimate the base flow index (BFI), and rank the methods that should be used for base flow separation. Three different digital graphical approaches were chosen and applied in this study for comparison. Daily time series discharge data were collected from the site for a period of 30 years (1986 to 2015) and were used to evaluate the algorithms. To separate base flow from surface runoff, the daily recorded streamflow (m³/s) data were used to calibrate the procedures and obtain parameter values for the basin. Additionally, the performance of each method was assessed using the standard error (SE), the coefficient of determination (R²), the flow duration curve (FDC), and the baseflow index. The findings indicate that, in general, each strategy can be applied worldwide to separate base flow; however, the Sliding Interval Method (SIM) performs significantly better than the other two techniques in this basin. The average base flow index was calculated to be 0.72 using the local minimum method, 0.76 using the fixed interval method, and 0.78 using the sliding interval method.
Keywords: baseflow index, digital graphical methods, streamflow, Emba Madre Watershed
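As an illustration of the sliding-interval idea summarized above (not the authors' code), the following minimal Python sketch separates baseflow from a daily discharge series and computes a base flow index; the window width, the synthetic data, and the function names are assumptions made for the example.

```python
import numpy as np

def sliding_interval_baseflow(q, half_width=5):
    # Approximate sliding-interval separation: baseflow on each day is the
    # minimum discharge inside a window centred on that day, capped at the
    # observed flow. The half-width (in days) is an illustrative choice.
    q = np.asarray(q, dtype=float)
    baseflow = np.empty_like(q)
    for i in range(len(q)):
        lo, hi = max(0, i - half_width), min(len(q), i + half_width + 1)
        baseflow[i] = min(q[lo:hi].min(), q[i])
    return baseflow

# Illustrative daily discharge series (m³/s); real data would span 30 years.
rng = np.random.default_rng(0)
q = 5 + 3 * np.abs(np.sin(np.arange(365) / 20)) + rng.exponential(2, 365)
qb = sliding_interval_baseflow(q, half_width=5)
bfi = qb.sum() / q.sum()   # base flow index = total baseflow / total flow
print(f"BFI = {bfi:.2f}")
```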
Procedia PDF Downloads 83
8919 Impact of Different Modulation Techniques on the Performance of Free-Space Optics
Authors: Naman Singla, Ajay Pal Singh Chauhan
Abstract:
As the demand for high bit rates and high bandwidth increases rapidly, there is a need to examine this problem and find a technology that provides both. One possible solution is the use of optical fiber, which provides bandwidth in the THz range. However, optical fiber has the disadvantages of high cost and limited reach, since it is not possible to deploy it to every location on earth, and it also requires costly maintenance. A closely related technology is Free Space Optics (FSO). FSO is a line-of-sight technology in which a modulated optical beam, whether infrared or visible, is used to transfer information from one point to another through the atmosphere, which acts as the channel. This paper concentrates on analyzing the performance of FSO in terms of bit error rate (BER) and quality factor (Q) using different modulation techniques such as non-return-to-zero on-off keying (NRZ-OOK), differential phase shift keying (DPSK), and differential quadrature phase shift keying (DQPSK) in OptiSystem software. The findings show that an FSO system based on the DQPSK modulation technique performs better.
Keywords: attenuation, bit rate, free space optics, link length
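As a side note on the BER and Q-factor metrics mentioned above, the two are commonly related (for an intensity-modulated link with Gaussian noise) by BER ≈ 0.5·erfc(Q/√2). Below is a short, hedged Python sketch of that conversion, not taken from the paper or from OptiSystem.

```python
import numpy as np
from scipy.special import erfc

def ber_from_q(q_factor):
    # BER = 0.5 * erfc(Q / sqrt(2)) for Gaussian-noise-limited detection
    return 0.5 * erfc(q_factor / np.sqrt(2))

for q in (3, 6, 7):
    print(f"Q = {q}: BER ~ {ber_from_q(q):.2e}")
# Q = 6 corresponds to a BER of roughly 1e-9, a common acceptance threshold.
```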
Procedia PDF Downloads 348
8918 Regional Hydrological Extremes Frequency Analysis Based on Statistical and Hydrological Models
Authors: Hadush Kidane Meresa
Abstract:
Hydrological extremes frequency analysis is the foundation for hydraulic engineering design, flood protection, drought management, and water resources management and planning, enabling the available water resources to be used to meet the objectives of different organizations and sectors in a country. The spatial variation of the statistical characteristics of extreme flood and drought events is a key element of regional flood and drought analysis and mitigation management. For regions with different hydro-climates, where the data sets are short, scarce, of poor quality, or insufficient, regionalization methods are applied to transfer at-site data to a region. This study aims at regional high- and low-flow frequency analysis for Polish river basins. The frequent occurrence of hydrological extremes in the region and rapid water resources development in these basins have caused serious concerns over the flood and drought magnitudes and frequencies of the rivers in Poland. The magnitudes and frequencies of high and low flows in the basins are needed for flood and drought planning, management, and protection at present and in the future. Hydrologically homogeneous high- and low-flow regions were formed by cluster analysis of site characteristics, using hierarchical and C-means clustering and the PCA method. Statistical tests for regional homogeneity were applied using discordancy and heterogeneity measures. In compliance with the results of these tests, the river basins were divided into ten homogeneous regions. In this study, frequency analysis of high and low flows was conducted using annual maximum (AM) series for high flows and 7-day minimum low-flow series, with six statistical distributions. The use of the L-moment and LL-moment methods showed a homogeneous region over the entire province, with the Generalized Logistic (GLOG), Generalized Extreme Value (GEV), Pearson Type III (P-III), Generalized Pareto (GPAR), Weibull (WEI), and Power (PR) distributions as the regional drought and flood frequency distributions. The 95th percentile and flow duration curves of 1, 7, 10, and 30 days were plotted for 10 stations. However, the cluster analysis identified two regions in the west and east of the province, where the L-moment and LL-moment methods demonstrated the homogeneity of the regions, with the GLOG and Pearson Type III (P-III) distributions as the regional frequency distributions for each region, respectively. The spatial variation and regional frequency distribution of flood and drought characteristics for the 10 best catchments from the whole region were determined, and besides the main variable (streamflow: high and low), variables more closely related to physiographic and drainage characteristics were used to identify and delineate homogeneous pools and to derive the best regression models for ungauged sites. These are mean annual rainfall, seasonal flow, average slope, NDVI, aspect, flow length, flow direction, maximum soil moisture, elevation, and drainage order. The regional high-flow or low-flow relationship between one streamflow characteristic (AM or 7-day mean annual low flows) and several basin characteristics was developed using the Generalized Linear Mixed Model (GLMM) and Generalized Least Squares (GLS) regression model, providing a simple and effective method for estimating floods and droughts of desired return periods for ungauged catchments.
Keywords: flood, drought, frequency, magnitude, regionalization, stochastic, ungauged, Poland
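Since the abstract leans on L-moments for the regional frequency analysis, here is a minimal, hedged Python sketch of how sample L-moments and L-moment ratios can be computed from an annual-maximum series via probability-weighted moments; the flow values are invented for illustration, and this is not the full Hosking-Wallis regional procedure.

```python
import numpy as np

def sample_l_moments(x):
    # First three L-moments from probability-weighted moments b0, b1, b2.
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
    l1 = b0                      # L-location
    l2 = 2 * b1 - b0             # L-scale
    l3 = 6 * b2 - 6 * b1 + b0
    return l1, l2, l2 / l1, l3 / l2   # lambda1, lambda2, L-CV, L-skewness

# Illustrative annual-maximum flow series (m³/s)
am_flows = np.array([120, 85, 200, 150, 95, 310, 175, 140, 260, 110])
l1, l2, lcv, lskew = sample_l_moments(am_flows)
print(f"lambda1={l1:.1f}, lambda2={l2:.1f}, L-CV={lcv:.3f}, L-skew={lskew:.3f}")
```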
Procedia PDF Downloads 603
8917 Delineation of the Geoelectric and Geovelocity Parameters in the Basement Complex of Northwestern Nigeria
Authors: M. D. Dogara, G. C. Afuwai, O. O. Esther, A. M. Dawai
Abstract:
The geology of Northern Nigeria is under intense investigation, particularly that of the northwest, which is believed to be part of the basement complex. The variability of the lithology is consistently inconsistent, hence the need for a close-range study. In view of the above, two geophysical techniques, vertical electrical sounding employing the Schlumberger array and the seismic refraction method, were used to delineate the geoelectric and geovelocity parameters of the basement complex of northwestern Nigeria. A total area of 400,000 m² was covered, with sixty geoelectric stations established and sixty sets of seismic refraction data collected using the forward and reverse method. The interpretation of the resistivity data suggests that the area is underlain by not more than five geoelectric layers of varying thicknesses and resistivities when a maximum half electrode spread of 100 m was used. The interpreted seismic data revealed two geovelocity layers, with velocities ranging from 478 m/s to 1666 m/s for the first layer and from 1166 m/s to 7141 m/s for the second layer. The results of the two techniques suggest that the study area has an undulating bedrock topography with geoelectric and geovelocity layers composed of weathered rock materials.
Keywords: basement complex, delineation, geoelectric, geovelocity, Nigeria
Procedia PDF Downloads 152
8916 Lineup Optimization Model of Basketball Players Based on the Prediction of Recursive Neural Networks
Authors: Wang Yichen, Haruka Yamashita
Abstract:
In recent years, in the field of sports, decision making such as in-game member selection and game strategy based on the analysis of accumulated sports data has been widely attempted. In fact, in the NBA, the basketball league where the world's highest-level players gather, teams analyze data using various statistical techniques to win games. However, it is difficult to analyze game data for each play, such as ball tracking or the motion of players in the game, because the situation of the game changes rapidly and the structure of the data is complicated. Therefore, an analysis method for real-time game play data is needed. In this research, we propose an analytical model for determining the optimal lineup composition using real-time play data, which is considered to be difficult for all coaches. Because replacing the entire lineup is too complicated, the actual questions for player replacement are whether or not the lineup should be changed and whether or not a Small Ball lineup should be adopted. Therefore, we propose an analytical model for the optimal player selection problem based on Small Ball lineups. In basketball, scoring data can be accumulated for each play, indicating a player's contribution to the game, and these scoring data can be treated as time series data. In order to compare the importance of players in different situations and lineups, we combine an RNN (Recurrent Neural Network) model, which can analyze time series data, and an NN (Neural Network) model, which can analyze the situation on the court, to build a prediction model of the score. This model is capable of identifying the current optimal lineup for different situations. In this research, we collected all accumulated NBA data from the 2019-2020 season. We then applied the method to actual basketball play data to verify the reliability of the proposed model.
Keywords: recurrent neural network, players lineup, basketball data, decision making model
Procedia PDF Downloads 134
8915 The Effect of Addition of Dioctyl Terephthalate and Calcite on the Tensile Properties of Organoclay/Linear Low Density Polyethylene Nanocomposites
Authors: A. Gürses, Z. Eroğlu, E. Şahin, K. Güneş, Ç. Doğar
Abstract:
In recent years, polymer/clay nanocomposites have generated great interest in the polymer industry as a new type of composite material because of their superior properties, which include high heat deflection temperature, gas barrier performance, dimensional stability, enhanced mechanical properties, optical clarity, and flame retardancy compared with the pure polymer or conventional composites. This work aimed to investigate the change in the tensile properties of organoclay/linear low density polyethylene (LLDPE) nanocomposites with the use of dioctyl terephthalate (DOTP) as plasticizer and calcite as filler. The synthesized composites and organoclay were characterized using techniques such as XRD, HRTEM, and FTIR. The spectroscopic results indicate that the organoclay platelets were well dispersed within the polymeric matrix. The tensile properties of the composites were compared using the stress-strain curve drawn for each composite and the pure polymer. It was observed that the composites prepared by adding the plasticizer at different ratios and a certain amount of calcite exhibited different tensile behaviors compared to the pure polymer.
Keywords: linear low density polyethylene, nanocomposite, organoclay, plasticizer
Procedia PDF Downloads 294
8914 Optimization of Acid Treatments by Assessing Diversion Strategies in Carbonate and Sandstone Formations
Authors: Ragi Poyyara, Vijaya Patnana, Mohammed Alam
Abstract:
When acid is pumped into damaged reservoirs for damage removal/stimulation, distorted inflow of acid into the formation occurs because the acid preferentially travels into highly permeable regions rather than low-permeability regions, or (in general) into the path of least resistance. This can lead to poor zonal coverage and hence warrants diversion to carry out an effective placement of acid. Diversion is desirably a reversible technique of temporarily reducing the permeability of high-permeability zones, thereby forcing the acid into lower-permeability zones. The uniqueness of each reservoir can pose several challenges to engineers attempting to devise optimum and effective diversion strategies. Diversion techniques include mechanical placement and/or chemical diversion of treatment fluids, further sub-classified into ball sealers, bridge plugs, packers, particulate diverters, viscous gels, crosslinked gels, relative permeability modifiers (RPMs), foams, and/or the use of placement techniques such as coiled tubing (CT) and the maximum pressure difference and injection rate (MAPDIR) methodology. It is not always realized that the effectiveness of diverters greatly depends on reservoir properties, such as formation type, temperature, reservoir permeability, heterogeneity, and physical well characteristics (e.g., completion type, well deviation, length of treatment interval, multiple intervals, etc.). This paper reviews the mechanisms by which each variety of diverter functions and discusses the effect of various reservoir properties on the efficiency of diversion techniques. Guidelines are recommended to help enhance productivity from zones of interest by choosing the best methods of diversion while pumping an optimized amount of treatment fluid. The success of an overall acid treatment often depends on the effectiveness of the diverting agents.
Keywords: diversion, reservoir, zonal coverage, carbonate, sandstone
Procedia PDF Downloads 433
8913 Synthesis, Characterization, Antioxidant and Anti-inflammatory Studies of Modern Synthetic Tetra Phenyl Porphyrin Derivatives
Authors: Mian Gul Sayed, Rahim Shah, Fazal Mabood, Najeeb Ur Rahman, Maher Noor
Abstract:
Embarking on the frontier of molecular advancement, this study focuses on the synthesis and characterization of a distinct class of porphyrin derivatives, specifically the 5,10,15,20-tetrakis(3-bromopropoxyphenyl) porphyrins. Through meticulous synthetic methodologies, these derivatives are crafted, strategically incorporating bromopropoxyphenyl moieties at distinct positions within the porphyrin framework. This research aims to unravel the structural intricacies and explore the potential applications of these compounds through detailed characterization utilizing advanced analytical techniques. 5,10,15,20-Tetrakis(4-hydroxyphenyl)porphyrin was synthesized by reacting pyrrole with p-hydroxybenzaldehyde and was then converted into 5,10,15,20-tetrakis(4-bromoalkoxyphenyl)porphyrin. The 5,10,15,20-tetrakis(4-bromoalkoxyphenyl)porphyrin was treated with isopropylphenol, para-aminophenol, hydroquinone, 2-naphthol, and 1-naphthol, and different ether-linked derivatives were obtained. The synthesized compounds were analyzed using contemporary spectroscopic techniques such as UV-Vis, NMR, and mass spectrometry, and were also tested for biological activities such as antioxidant and anti-inflammatory activity.
Keywords: tetraphenyl porphyrin, NMR, antioxidant, anti-inflammatory
Procedia PDF Downloads 23
8912 Design an Assessment Model of Research and Development Capabilities with the New Product Development Approach: A Case Study of Iran Khodro Company
Authors: Hamid Hanifi, Adel Azar, Alireza Booshehri
Abstract:
In order to know the capability level of R&D units in the automotive industry, it is essential that organizations continuously compare themselves with the standard level and with organizations above themselves so that they keep improving. Given the importance of this issue, this research presents an assessment model for R&D capabilities based on a review of new product development in the automotive industry of Iran. Iran Khodro Company was selected as the case study. For this purpose, first, through a literature review, about 200 indicators affecting R&D capabilities and new product development were extracted. Then, 29 of the more important indicators were selected by industry and academic experts, and a questionnaire was distributed among the statistical population, which consisted of 410 individuals at Iran Khodro Company. The 410 questionnaires were used for exploratory factor analysis, and the data of 308 questionnaires randomly drawn from the same population were then used for confirmatory factor analysis. The exploratory factor analysis led to the categorization of the indicators into 9 secondary dimensions, which were named according to the literature review and the professors' opinion. Using structural equation modeling and AMOS software, confirmatory factor analysis was conducted, and the final model with 9 secondary dimensions was confirmed. The 9 secondary dimensions of this research are as follows: 1) research and design capability, 2) customer and market capability, 3) technology capability, 4) financial resources capability, 5) organizational chart, 6) intellectual capital capability, 7) NPD process capability, 8) managerial capability, and 9) strategy capability.
Keywords: research and development, new products development, structural equations, exploratory factor analysis, confirmatory factor analysis
Procedia PDF Downloads 344
8911 Quality of Care of Medical Male Circumcisions: A Non-Negotiable for Right to Care
Authors: Nelson Igaba, C. Onaga, S. Hlongwane
Abstract:
Background: Medical Male Circumcision (MMC) is part of a comprehensive HIV prevention strategy. The quality of MMC done at Right To Care (RtC) sites is maintained by Continuous Quality Improvement (CQI) based on the findings of assessments by internal and independent external assessors, who evaluate such parameters as the quality of the surgical procedure, infection control, etc. There are 12 RtC MMC teams in Mpumalanga, two of which are headed by Medical Officers and 10 by Clinical Associates (Clin A). Objectives: To compare the quality (i) of care rendered at doctor-headed sites (DHS) versus Clin A-headed sites (CHS); (ii) of CQI assessments (external versus internal). Methodology: A retrospective review of data from RightMax™ (a novel RtC data management system) and CQI reports (external and internal) was done. The CQI assessment scores of October 2015 and October 2016 were taken as the baseline and the latest, respectively. Four sites with 745-810 circumcisions per annum were purposively selected: the two DHS (group A) and two CHS (group B). Statistical analyses were conducted using R (2017 version). Results: There was no significant difference in the latest CQI scores between the two groups (DHS and CHS) (ANOVA, F = 1.97, df = 1, P = 0.165), between internal and external CQI assessment scores (ANOVA, F = 2.251, df = 1, P = 0.139), or among the individual sites (ANOVA, F = 1.095, df = 2, P = 0.341). Of the total of 16 adverse events reported by the four sites in the 12 months reviewed (all were infections), there was no statistical evidence that the documented severity of the infection was different for DHS and CHS (Fisher's exact test, p-value = 0.269). Conclusion: At RtC VMMC sites in Mpumalanga, internal and external/independent CQI assessments are comparable, and the quality of care of VMMC is standardized, with the performance of well-supervised clinical associates comparing well with that of medical officers.
Keywords: adverse events, Right to Care, male medical circumcision, continuous quality improvement
Procedia PDF Downloads 179
8910 Understanding the Classification of Rain Microstructure and Estimation of Z-R Relationship using a Micro Rain Radar in Tropical Region
Authors: Tomiwa, Akinyemi Clement
Abstract:
Tropical regions experience diverse and complex precipitation patterns, posing significant challenges for accurate rainfall estimation and forecasting. This study addresses the problem of effectively classifying tropical rain types and refining the Z-R (Reflectivity-Rain Rate) relationship to enhance rainfall estimation accuracy. Through a combination of remote sensing, meteorological analysis, and machine learning, the research aims to develop an advanced classification framework capable of distinguishing between different types of tropical rain based on their unique characteristics. This involves utilizing high-resolution satellite imagery, radar data, and atmospheric parameters to categorize precipitation events into distinct classes, providing a comprehensive understanding of tropical rain systems. Additionally, the study seeks to improve the Z-R relationship, a crucial aspect of rainfall estimation. One year of rainfall data was analyzed using a Micro Rain Radar (MRR) located at The Federal University of Technology Akure, Nigeria, measuring rainfall parameters from ground level to a height of 4.8 km with a vertical resolution of 0.16 km. Rain rates were classified into low (stratiform) and high (convective) based on various microstructural attributes such as rain rates, liquid water content, Drop Size Distribution (DSD), average fall speed of the drops, and radar reflectivity. By integrating diverse datasets and employing advanced statistical techniques, the study aims to enhance the precision of Z-R models, offering a more reliable means of estimating rainfall rates from radar reflectivity data. This refined Z-R relationship holds significant potential for improving our understanding of tropical rain systems and enhancing forecasting accuracy in regions prone to heavy precipitation.
Keywords: remote sensing, precipitation, drop size distribution, micro rain radar
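To make the Z-R step concrete, the following hedged Python sketch fits the power law Z = aR^b by least squares in log-log space after splitting the data into stratiform and convective classes; the 10 mm/h threshold and the synthetic Z and R values are assumptions for illustration, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)
R = rng.lognormal(mean=1.0, sigma=1.0, size=500)          # synthetic rain rates (mm/h)
Z = 200.0 * R**1.6 * rng.lognormal(0.0, 0.2, size=500)    # synthetic reflectivity (mm^6 m^-3)

def fit_zr(R, Z):
    # ln Z = ln a + b ln R, so a linear fit in log space recovers a and b
    b, ln_a = np.polyfit(np.log(R), np.log(Z), 1)
    return np.exp(ln_a), b

for label, mask in [("stratiform", R < 10), ("convective", R >= 10)]:
    a, b = fit_zr(R[mask], Z[mask])
    print(f"{label:11s}: Z = {a:.0f} * R^{b:.2f}  (n={mask.sum()})")
```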
Procedia PDF Downloads 42
8909 An Advanced Approach to Detect and Enumerate Soil-Transmitted Helminth Ova from Wastewater
Authors: Vivek B. Ravindran, Aravind Surapaneni, Rebecca Traub, Sarvesh K. Soni, Andrew S. Ball
Abstract:
Parasitic diseases have a devastating, long-term impact on human health and welfare. More than two billion people are infected with soil-transmitted helminths (STHs), including the roundworms (Ascaris), hookworms (Necator and Ancylostoma) and whipworm (Trichuris), with the majority occurring in the tropical and subtropical regions of the world. Despite its low prevalence in developed countries, the removal of STHs from wastewater remains crucial to allow the safe use of sludge or recycled water in agriculture. Conventional methods such as incubation and optical microscopy are cumbersome; consequently, the results vary drastically from person to person when observing the ova (eggs) under the microscope. Although PCR-based methods are an alternative to conventional techniques, they lack the ability to distinguish between viable and non-viable helminth ova. As a result, wastewater treatment industries are in major need of radically new and innovative tools to detect and quantify STH eggs precisely, accurately, and cost-effectively. In our study, we focus on the following novel and innovative techniques:
- Recombinase polymerase amplification and surface enhanced Raman spectroscopy (RPA-SERS) based detection of helminth ova.
- Use of metal nanoparticles and their relative nanozyme activity.
- Colorimetric detection, differentiation and enumeration of genera of helminth ova using hydrolytic enzymes (chitinase and lipase).
- Propidium monoazide (PMA)-qPCR to detect viable helminth ova.
- Modified assay to recover and enumerate helminth eggs from fresh raw sewage.
- Transcriptome analysis of Ascaris ova in fresh raw sewage.
The aforementioned techniques have the potential to replace current conventional and molecular methods, thereby producing a standard protocol for the determination and enumeration of helminth ova in sewage sludge.
Keywords: colorimetry, helminth, PMA-qPCR, nanoparticles, RPA, viable
Procedia PDF Downloads 300
8908 A Survey on Intelligent Connected-Vehicle Applications Based on Intercommunication Techniques in Smart Cities
Authors: B. Karabuluter, O. Karaduman
Abstract:
Connected vehicles are intelligent vehicles that can communicate with each other, and smart cities are their most prominent application area. The most important goal of smart cities planned to facilitate people's lives is to make transportation more comfortable and safe through intelligent/autonomous/driverless vehicles communicating with each other. To achieve this, the city must first have a communication infrastructure, and the vehicles must be able to communicate with this infrastructure and with each other. In this context, intelligent transport studies aimed at solving the transportation and traffic problems of classical cities continue to increase rapidly. In this study, current connected-vehicle applications developed for smart cities are considered in terms of communication techniques, vehicular networking, IoT, urban transportation implementations, intelligent traffic management, road safety, and self-driving. The taxonomies and assessments performed in this work show the trend of studies on inter-vehicle communication systems in smart cities and contribute by revealing the requirements in this area.
Keywords: smart city, connected vehicles, infrastructures, VANET, wireless communication, intelligent traffic management
Procedia PDF Downloads 527
8907 Savi Scout versus Wire-Guided Localization in Non-palpable Breast Lesions – Comparison of Breast Tissue Volume and Weight and Excision Safety Margin
Authors: Walid Ibrahim, Abdul Kasem, Sudeendra Doddi, Ilaria Giono, Tareq Sabagh, Muhammad Ammar, Nermin Osman
Abstract:
Background: Wire-guided localization (WL) is the most widely used method for the localization of non-palpable breast lesions. SAVI SCOUT occult lesion localization (SSL) is a new technique in breast-conserving surgery. SSL has the potential benefit of improving radiology workflow as well as accurate localization. Purpose: The purpose of this study is to compare breast tissue specimen volume and weight and excision margin between WL and SSL. Materials and methods: A single-institution retrospective analysis of 377 female patients who underwent wide local breast excision with the SAVI SCOUT and/or wire-guided technique between 2018 and 2021 in the breast department of a UK university teaching hospital. Breast tissue specimen volume and weight and excision margin were evaluated in the three localization groups. Results: Three hundred and seventy-seven patients were studied. Of these, 261 had wire localization, 88 had SCOUT, and 28 had dual localization techniques. Tumor size ranged from 1 to 75 mm (median 20 mm). The pathology specimen weight ranged from 1 to 466 g (median 46.8 g), and the volume ranged from 1.305 to 1560 cm³ (median 106.32 cm³). SCOUT localization was associated with a significantly lower specimen weight than wire or dual-technique localization (median 41 g vs 47.3 g and 47 g, p = 0.029). SCOUT was not associated with a better specimen volume, with borderline significance in comparison to the wire and combined techniques (median 108 cm³ vs 105 cm³ and 105 cm³, p = 0.047). There was a significant correlation between tumor size and pathology specimen weight in the three groups. SCOUT showed a better >2 mm safety margin in comparison to the other two techniques (p = 0.031). Conclusion: Preoperative SCOUT localization is associated with better specimen weight and better specimen margin. SCOUT did not show any benefit in terms of specimen volume, which may be due to the difficulty of measuring specimen volume accurately given the irregularity of soft tissue specimens.
Keywords: scout, wire, localization, breast
Procedia PDF Downloads 112
8906 Bioproduction of Indirubin from Fermentation and Renewable Sugars Through Genomic and Metabolomic Engineering of a Bacterial Strain
Authors: Vijay H. Ingole, Efthimia Lioliou
Abstract:
Indirubin, a key bioactive component of traditional Chinese medicine, has gained increasing recognition for its potential in modern biomedical applications, particularly in pharmacology and therapeutics. The present work aimed to harness this potential by engineering an Escherichia coli strain capable of high-yield indirubin production. Through meticulous genetic engineering, we optimized the metabolic pathways in E. coli to enhance indirubin synthesis. Further, we explored the optimization of culture media and indirubin yield via batch and fed-batch fermentation techniques. By fine-tuning upstream process (USP) parameters, including nutrient composition, pH, temperature, and aeration, we established conditions that maximized both cell growth and indirubin production. Additionally, significant efforts were dedicated to refining downstream process (DSP) conditions for the extraction, purification, and quantification of indirubin. Utilizing advanced biochemical methods and analytical techniques such as UHPLC, we ensured the production of high-purity indirubin. This approach not only improved the economic viability of indirubin bioproduction but also aligned with the principles of green production and sustainability.
Keywords: indirubin, bacterial strain, fermentation, HPLC
Procedia PDF Downloads 30
8905 Bioinformatic Approaches in Population Genetics and Phylogenetic Studies
Authors: Masoud Sheidai
Abstract:
Biologists working in population genetics and phylogeny have various research tasks, such as assessing populations' genetic variability and divergence, species relatedness, the evolution of genetic and morphological characters, and the identification of DNA SNPs with adaptive potential. To tackle these problems and reach a concise conclusion, they must use proper and efficient statistical and bioinformatic methods as well as suitable genetic and morphological characteristics. In recent years, different bioinformatic and statistical methods, which are based on various well-documented assumptions, have become the proper analytical tools in the hands of researchers. Species delineation is usually carried out using clustering methods such as K-means clustering, based on proper distance measures for the studied features of the organisms. A well-defined species is assumed to be separated from other taxa by molecular barcodes. Species relationships are studied using molecular markers, which are analyzed by analytical methods such as multidimensional scaling (MDS) and principal coordinate analysis (PCoA). Species population structuring and genetic divergence are usually investigated by the PCoA and PCA methods and a network diagram; these are based on bootstrapping of the data. The association of different genes and DNA sequences with ecological and geographical variables is determined by LFMM (latent factor mixed model) and redundancy analysis (RDA), which are based on Bayesian and distance methods. Molecular and morphological characters differentiating the studied species may be identified by linear discriminant analysis (DA) and discriminant analysis of principal components (DAPC). We shall illustrate these methods and related conclusions by giving examples from different edible and medicinal plant species.
Keywords: GWAS analysis, K-Means clustering, LFMM, multidimensional scaling, redundancy analysis
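As one illustration of the ordination methods listed above, the following minimal Python sketch performs a classical principal coordinate analysis (PCoA) on a pairwise distance matrix; the toy distance matrix is invented for the example, and the function is a generic textbook implementation, not tied to any particular genetic data set.

```python
import numpy as np

def pcoa(dist, n_axes=2):
    # Double-centre the squared distances and take the leading eigenvectors,
    # scaled by the square root of their eigenvalues.
    d2 = np.asarray(dist, dtype=float) ** 2
    n = d2.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centring matrix
    B = -0.5 * J @ d2 @ J                    # Gower-centred matrix
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1]           # largest eigenvalues first
    vals, vecs = vals[order], vecs[:, order]
    coords = vecs[:, :n_axes] * np.sqrt(np.clip(vals[:n_axes], 0, None))
    return coords, vals

# Toy pairwise genetic distance matrix for five individuals
D = np.array([[0, 2, 4, 7, 6],
              [2, 0, 3, 6, 5],
              [4, 3, 0, 5, 4],
              [7, 6, 5, 0, 2],
              [6, 5, 4, 2, 0]], dtype=float)
coords, eigvals = pcoa(D)
print(coords)   # ordination coordinates on the first two axes
```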
Procedia PDF Downloads 127
8904 Mobile Application Interventions in Positive Psychology: Current Status and Recommendations for Effective App Design
Authors: Gus Salazar, Jeremy Bekker, Lauren Linford, Jared Warren
Abstract:
Positive psychology practices allow its principles to be applied to all people, regardless of their current level of functioning. To increase the dissemination of these practices, interventions are being adapted for use with digital technology, such as mobile apps. However, the research regarding positive psychology mobile app interventions is still in its infancy. In an effort to facilitate progress in this important area, we 1) conducted a qualitative review to summarize the current state of the positive psychology mobile app literature and 2) developed research-supported recommendations for positive psychology app development to maximize behavior change. In our literature review, we found that while positive psychology apps varied widely in content and purpose, there was a near-complete lack of research supporting their effectiveness. Most apps provided no rationale for the behavioral change techniques (BCTs) they employed, and most were not developed with specific theoretical frameworks or design models in mind. Given this problem, we recommended four steps for effective positive psychology app design. First, developers must establish their app in a research-supported theory of change. Second, researchers must select appropriate behavioral change techniques that are consistent with their app's goals. Third, researchers must leverage effective design principles. These steps will help mobile applications use data-driven methods for encouraging behavior change in their users. Lastly, we discuss directions for future research. In particular, researchers must investigate the effectiveness of various BCTs in positive psychology interventions. Although there is some research on this point, we do not yet clearly understand the mechanisms within the apps that lead to behavior change. Additionally, app developers must provide data on the effectiveness of their mobile apps. As developers follow these steps for effective app development and as researchers continue to investigate what makes these apps most effective, we will provide millions of people in need with access to research-based mental health resources.
Keywords: behavioral change techniques, mobile app, mobile intervention, positive psychology
Procedia PDF Downloads 227
8903 Modeling and Simulation of Textile Effluent Treatment Using Ultrafiltration Membrane Technology
Authors: Samia Rabet, Rachida Chemini, Gerhard Schäfer, Farid Aiouache
Abstract:
The textile industry generates large quantities of wastewater, which poses significant environmental problems due to its complex composition and high pollutant load, consisting principally of heavy metals, large amounts of COD, and dyes. Separation treatment methods are known for their effectiveness in removing contaminants, and membrane separation techniques in particular are a promising process for the treatment of textile effluent due to their versatility, efficiency, and low energy requirements. This study focuses on the modeling and simulation of membrane separation technologies with a cross-flow filtration process for textile effluent treatment. It aims to explore the application of mathematical models and computational simulations using ASPEN Plus software to predict the separation of a complex, real effluent. The results demonstrate the effectiveness of modeling and simulation techniques in predicting pollutant removal efficiencies, with a global deviation of 1.83% between experimental and simulated results; membrane fouling behavior and overall process performance (hydraulic resistance, membrane porosity) were also estimated, indicating that the membrane loses 10% of its efficiency after 40 min of operation.
Keywords: membrane separation, ultrafiltration, textile effluent, modeling, simulation
Procedia PDF Downloads 61
8902 Modelling the Behavior of Commercial and Test Textiles against Laundering Process by Statistical Assessment of Their Performance
Authors: M. H. Arslan, U. K. Sahin, H. Acikgoz-Tufan, I. Gocek, I. Erdem
Abstract:
Various exterior factors have perpetual effects on textile materials during wear, use, and laundering in everyday life. In accordance with their frequency of use, textile materials are required to be laundered at certain intervals. The medium in which the laundering process takes place has inevitable detrimental physical and chemical effects on textile materials, caused by the parameters inherent in the process. The connatural structures of various textile materials result in many different physical, chemical, and mechanical characteristics. Because of their specific structures, these materials behave differently against several exterior factors. By modeling the behavior of commercial and test textiles group-wise against the laundering process, it is possible to disclose the relation between these two groups of materials, which leads to a better understanding of the similarities and differences in their behavior with respect to the washing parameters. Thus, the goal of the current research is to examine the behavior of two groups of textile materials, commercial textiles and test textiles, towards the main washing machine parameters during the laundering process, such as temperature, load quantity, mechanical action, and water level, concentrating on shrinkage, pilling, sewing defects, collar abrasion, defects other than sewing, whitening, and overall properties of the textiles. In this study, cotton fabrics were chosen as commercial textiles because garments made of cotton are the products most demanded by textile consumers in daily life. A full factorial experimental set-up was used to design the experimental procedure. All profiles, always including all of the commercial and test textiles, were laundered for 20 cycles in a commercial home laundering machine to investigate the effects of the chosen parameters. For the laundering process, a modified version of the IEC 60456 test method was utilized. The amount of detergent was altered as 0.5% gram per liter depending on the varying load quantity levels. Datacolor 650®, EMPA Photographic Standards for Pilling Test, and visual examination were utilized to test and characterize the textiles. Furthermore, the relation between commercial and test textiles in terms of performance was investigated in depth with the help of statistical analysis performed in the MINITAB® package, modeling their behavior against the parameters of the laundering process. In the experimental work, the behaviors of both groups of textiles towards washing machine parameters were visually and quantitatively assessed in the dry state.
Keywords: behavior against washing machine parameters, performance evaluation of textiles, statistical analysis, commercial and test textiles
Procedia PDF Downloads 362
8901 Principal Component Analysis Combined Machine Learning Techniques on Pharmaceutical Samples by Laser Induced Breakdown Spectroscopy
Authors: Kemal Efe Eseller, Göktuğ Yazici
Abstract:
Laser-induced breakdown spectroscopy (LIBS) is a rapid optical atomic emission spectroscopy technique used for material identification and analysis, with the advantages of in-situ analysis, elimination of intensive sample preparation, and only micro-destructive effects on the material to be tested. LIBS delivers short laser pulses onto the material in order to create plasma by excitation of the material above a certain threshold. The plasma characteristics, which consist of wavelength value and intensity amplitude, depend on the material and the experimental environment. In the present work, the spectral profiles of medicine samples were obtained via LIBS. The datasets include two different concentrations for each of two paracetamol-based medicines, namely Aferin and Parafon. The spectral data were preprocessed by filling outliers based on quartiles, smoothing the spectra to eliminate noise, and normalizing both the wavelength and intensity axes. Statistical information was obtained, and principal component analysis (PCA) was applied to both the preprocessed and raw datasets. The machine learning models were set up with two different train-test splits, 70% training and 30% test, and 80% training and 20% test. Cross-validation was used to protect the models against overfitting, since the sample amount is small. The machine learning results for the preprocessed and raw datasets were compared for both splits. This is the first time that all of these supervised machine learning classification algorithms, consisting of decision trees, discriminant analysis, naïve Bayes, support vector machines (SVM), k-NN (k-nearest neighbor), ensemble learning, and neural network algorithms, were applied to LIBS data of paracetamol-based pharmaceutical samples at different concentrations, on both the preprocessed and raw datasets, in order to observe the effect of preprocessing.
Keywords: machine learning, laser-induced breakdown spectroscopy, medicines, principal component analysis, preprocessing
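Below is a hedged sketch of the preprocessing, PCA, and classification chain described above, using scikit-learn on synthetic spectra that merely stand in for the LIBS data; the array shapes, class labels, and SVM hyperparameters are assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 80, 1024
X = rng.normal(size=(n_samples, n_wavelengths))   # stand-in spectra
y = rng.integers(0, 4, size=n_samples)            # e.g. 2 medicines x 2 concentrations
X[y == 1] += 0.3                                  # inject weak class structure
X[y == 3] -= 0.3

model = make_pipeline(
    StandardScaler(),        # normalize the intensity axis
    PCA(n_components=10),    # reduce the spectral dimensionality
    SVC(kernel="rbf", C=1.0) # one of the several classifiers compared in the study
)
scores = cross_val_score(model, X, y, cv=5)       # cross-validation on a small set
print(f"CV accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```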
Procedia PDF Downloads 91
8900 Analysing Time Series for a Forecasting Model to the Dynamics of Aedes Aegypti Population Size
Authors: Flavia Cordeiro, Fabio Silva, Alvaro Eiras, Jose Luiz Acebal
Abstract:
Aedes aegypti is present in the tropical and subtropical regions of the world and is a vector of several diseases such as dengue fever, yellow fever, chikungunya, and zika. The growth in the number of arbovirus cases in recent decades has become a matter of great concern worldwide. Meteorological factors like mean temperature and precipitation are known to influence infestation by the species through effects on physiology and ecology, altering the fecundity, mortality, lifespan, dispersion behaviour, and abundance of the vector. Models able to describe the dynamics of the vector population size should therefore take the meteorological variables into account. The relationship between meteorological factors and the population dynamics of Ae. aegypti adult females is studied to provide a good set of predictors to model the dynamics of the mosquito population size. Time-series data on captures of adult females from a public health surveillance program in the city of Lavras, MG, Brazil were analysed for their association with precipitation, humidity, and temperature through a set of statistical methods for time series analysis commonly adopted in signal processing, information theory, and neuroscience. Cross-correlation, a multicollinearity test, and whitened cross-correlation were applied to determine at which time lags the meteorological variables influence the dynamics of mosquito abundance. Among the findings, the studied case indicated strong collinearity between humidity and precipitation, and precipitation was selected to form a pair of descriptors together with temperature. The techniques used revealed significant associations between infestation indicators and both temperature and precipitation in the short, mid, and long terms, showing that these variables should be considered in entomological models and as public health indicators. A descriptive model used to test the results exhibits a strong correlation with the data.
Keywords: Aedes aegypti, cross-correlation, multicollinearity, meteorological variables
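A minimal, hedged Python sketch of the lag screening and multicollinearity check described above; the weekly series, lags, and coefficients are synthetic assumptions, not the Lavras surveillance data.

```python
import numpy as np

def lagged_crosscorr(x, y, max_lag=12):
    # Pearson correlation between y(t) and x(t - lag) for a range of lags
    out = {}
    for lag in range(max_lag + 1):
        xs, ys = (x[:-lag], y[lag:]) if lag else (x, y)
        out[lag] = np.corrcoef(xs, ys)[0, 1]
    return out

def vif(X):
    # Variance inflation factors of the columns of X (multicollinearity check)
    X = np.column_stack([np.ones(len(X)), X])
    vifs = []
    for j in range(1, X.shape[1]):
        others = np.delete(X, j, axis=1)
        beta, *_ = np.linalg.lstsq(others, X[:, j], rcond=None)
        resid = X[:, j] - others @ beta
        r2 = 1 - resid.var() / X[:, j].var()
        vifs.append(1.0 / (1.0 - r2))
    return vifs

# Illustrative weekly series: temperature, precipitation, and female captures
rng = np.random.default_rng(2)
t = np.arange(104)
temp = 20 + 5 * np.sin(2 * np.pi * t / 52) + rng.normal(0, 1, 104)
rain = np.clip(50 + 30 * np.sin(2 * np.pi * t / 52 - 0.5), 0, None)
captures = 10 + 0.8 * np.roll(temp, 4) + 0.05 * np.roll(rain, 6) + rng.normal(0, 2, 104)

cc = lagged_crosscorr(temp, captures)
print("lag with strongest temperature-capture correlation:", max(cc, key=cc.get))
print("VIFs for [temp, rain]:", np.round(vif(np.column_stack([temp, rain])), 2))
```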
Procedia PDF Downloads 181
8899 Assessing the Impacts of Riparian Land Use on Gully Development and Sediment Load: A Case Study of Nzhelele River Valley, Limpopo Province, South Africa
Authors: B. Mavhuru, N. S. Nethengwe
Abstract:
Human activities driving land degradation have triggered several environmental problems, especially in underdeveloped rural areas. The main aim of this study is to analyze the contribution of different land uses to gully development and sediment load in the Nzhelele River Valley in the Limpopo Province. Data were collected using different methods such as observation, field data techniques, and experiments. Satellite digital images, topographic maps, aerial photographs, and the sediment load static model also assisted in determining how land use affects gully development and sediment load. For data analysis, the researcher used the following methods: analysis of variance (ANOVA), descriptive statistics, the Pearson correlation coefficient, and statistical correlation methods. The results of the research illustrate that intense land use activities create negative changes, especially in areas that are highly fragile and vulnerable. A distinct impact of land use change was observed within the settlement area (9.6%) over a period of 5 years. A high correlation between soil organic matter and soil moisture (R = 0.96) was observed. Furthermore, a significant variation (p ≤ 0.6) between soil organic matter and soil moisture was also observed. A very significant variation (p ≤ 0.003) was observed in bulk density, and extremely significant variations (p ≤ 0.0001) were observed in organic matter and soil particle size. Sand mining and agricultural activities have contributed significantly to the sediment load in the Nzhelele River. A significantly high proportion of total suspended sediment (55.3%) and bed load (53.8%) was observed within the agricultural area. The connection between gully development and the various land use activities determines the amount of sediment load. These results are consistent with previous research and suggest that land use activities are likely to exacerbate gully development and sediment load in the Nzhelele River Valley.
Keywords: drainage basin, geomorphological processes, gully development, land degradation, riparian land use and sediment load
Procedia PDF Downloads 310
8898 Aqueous Extract of Argemone Mexicana Roots for Effective Corrosion Inhibition of Mild Steel in HCl Environment
Authors: Gopal Ji, Priyanka Dwivedi, Shanthi Sundaram, Rajiv Prakash
Abstract:
The inhibition effect of aqueous Argemone mexicana root extract (AMRE) on mild steel corrosion in 1 M HCl has been studied by weight loss, Tafel polarization curves, electrochemical impedance spectroscopy (EIS), scanning electron microscopy (SEM), and atomic force microscopy (AFM) techniques. The results indicate that the inhibition ability of AMRE increases with increasing amount of the extract. A maximum corrosion inhibition of 94% is observed at an extract concentration of 400 mg L⁻¹. Polarization curves and impedance spectra reveal that both cathodic and anodic reactions are suppressed due to passive layer formation at the metal-acid interface; this is also confirmed by SEM micrographs and FTIR studies. Furthermore, the effects of acid concentration (1-5 M), immersion time (120 hours), and temperature (30-60 °C) on the inhibition potential of AMRE have been investigated by the weight loss method and electrochemical techniques. An adsorption mechanism is also proposed on the basis of the weight loss results, which show good agreement with the Langmuir isotherm.
Keywords: mild steel, polarization, SEM, acid corrosion, EIS, green inhibition
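For readers unfamiliar with the weight-loss treatment, here is a hedged Python sketch of how inhibition efficiency and a Langmuir-isotherm check are commonly derived from such data; all concentrations and weight-loss values below are invented for illustration and are not the study's measurements.

```python
import numpy as np

C = np.array([50.0, 100.0, 200.0, 300.0, 400.0])   # extract concentration, mg/L
w0 = 120.0                                          # weight loss without inhibitor, mg
w = np.array([45.0, 30.0, 18.0, 11.0, 7.0])         # weight loss with inhibitor, mg

ie = (w0 - w) / w0 * 100.0      # inhibition efficiency, %
theta = ie / 100.0              # fractional surface coverage
# Langmuir isotherm: C/theta = 1/K_ads + C, so a linear C/theta-vs-C plot
# with a slope close to 1 supports Langmuir-type adsorption.
slope, intercept = np.polyfit(C, C / theta, 1)
K_ads = 1.0 / intercept         # adsorption equilibrium constant, L/mg
print("IE% :", np.round(ie, 1))
print(f"Langmuir slope = {slope:.3f}, K_ads = {K_ads:.4f} L/mg")
```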
Procedia PDF Downloads 496
8897 The Per Capita Income, Energy Production and Environmental Degradation: A Comprehensive Assessment of the Existence of the Environmental Kuznets Curve Hypothesis in Bangladesh
Authors: Ashique Mahmud, MD. Ataul Gani Osmani, Shoria Sharmin
Abstract:
In the first quarter of the twenty-first century, the most substantial global concern is environmental contamination, and it has become a priority for both the national and international community. Keeping this crucial fact in mind, this study applied different statistical and econometric methods to identify whether the gross national income of the country has a significant impact on electricity production from nonrenewable sources and on different air pollutants such as carbon dioxide, nitrous oxide, and methane emissions. The primary objective of this research was to analyze whether the environmental Kuznets curve hypothesis holds for the examined variables. After analyzing different statistical properties of the variables, this study came to the conclusion that the environmental Kuznets curve hypothesis holds for gross national income and carbon dioxide emissions in Bangladesh in the short run as well as the long run. This conclusion is based on the findings of ordinary least squares estimations, ARDL bound tests, short-run causality analysis, the error correction model, and other pre- and post-diagnostic tests employed in the structural model. Moreover, the study indicates that the relationship between gross national income and carbon dioxide emissions is in its initial stage of development and that emissions will increase up to an optimal peak; the compositional effect will then force emissions to decrease, and environmental quality will be restored in the long run.
Keywords: environmental Kuznets curve hypothesis, carbon dioxide emission in Bangladesh, gross national income in Bangladesh, autoregressive distributed lag model, granger causality, error correction model
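As a worked illustration of the EKC test itself (not the study's estimation), the quadratic specification ln(CO2) = b0 + b1*ln(income) + b2*ln(income)² implies an inverted U when b1 > 0 and b2 < 0, with the turning point at ln(income*) = -b1/(2*b2); the hedged Python sketch below fits this on synthetic data.

```python
import numpy as np

rng = np.random.default_rng(3)
ln_income = np.linspace(6.0, 9.0, 40)        # ln of per-capita income (synthetic)
ln_co2 = -10 + 3.0 * ln_income - 0.18 * ln_income**2 + rng.normal(0, 0.05, 40)

X = np.column_stack([np.ones_like(ln_income), ln_income, ln_income**2])
beta, *_ = np.linalg.lstsq(X, ln_co2, rcond=None)   # OLS fit of the quadratic EKC form
b0, b1, b2 = beta
print(f"b1 = {b1:.2f}, b2 = {b2:.2f}")
if b1 > 0 and b2 < 0:
    print(f"inverted-U shape; turning point at ln(income) = {-b1 / (2 * b2):.2f}")
```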
Procedia PDF Downloads 151
8896 Effect in Animal Nutrition of Genetically Modified Plants (GM)
Authors: Abdullah Özbilgin, Oguzhan Kahraman, Mustafa Selçuk Alataş
Abstract:
Plant breeders have made and will continue to make important contributions toward meeting the need for more and better feed and food. The use of new techniques to modify the genetic makeup of plants to improve their properties has led to a new generation of crops, grains, and their by-products for feed. The land area devoted to the cultivation of genetically modified (GM) plants has increased in recent years: in 2012, such plants were grown on over 170 million hectares globally in 28 different countries and were used by 17.3 million farmers worldwide. The majority of GM plants are used as feed material for food-producing farm animals. Despite the fact that GM plants have been used as feed for years and a number of feeding studies have proved their safety for animals, they still give rise to emotional public discussion.
Keywords: crops, genetically modified plant (GM), plant, safety
Procedia PDF Downloads 566
8895 Early Recognition and Grading of Cataract Using a Combined Log Gabor/Discrete Wavelet Transform with ANN and SVM
Authors: Hadeer R. M. Tawfik, Rania A. K. Birry, Amani A. Saad
Abstract:
Eyes are considered to be the most sensitive and important organs of the human body; thus, any eye disorder will affect the patient in all aspects of life. Cataract is one of those eye disorders that lead to blindness if not treated correctly and quickly. This paper demonstrates a model for automatic detection, classification, and grading of cataracts based on image processing techniques and artificial intelligence. The proposed system is developed to ease the cataract diagnosis process for both ophthalmologists and patients. The wavelet transform combined with the 2D Log Gabor wavelet transform was used as the feature extraction technique for a dataset of 120 eye images, followed by a classification process that classified the image set into three classes: normal, early, and advanced stage. A comparison between the two classifiers used, the support vector machine (SVM) and the artificial neural network (ANN), was performed on the same dataset of 120 eye images. It was concluded that the SVM gave better results than the ANN, with an accuracy of 96.8% versus 92.3%.
Keywords: cataract, classification, detection, feature extraction, grading, log-gabor, neural networks, support vector machines, wavelet
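Below is a hedged sketch of wavelet-based feature extraction feeding an SVM, in the spirit of the pipeline above; it uses plain 2D DWT sub-band energies via PyWavelets (the Log Gabor stage is omitted), and the images, labels, and wavelet choice are assumptions for illustration only.

```python
import numpy as np
import pywt
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def dwt_features(image, wavelet="db4", level=2):
    # Energy of each 2D DWT sub-band as a compact feature vector
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    feats = [np.mean(coeffs[0] ** 2)]                 # approximation energy
    for (cH, cV, cD) in coeffs[1:]:
        feats += [np.mean(cH ** 2), np.mean(cV ** 2), np.mean(cD ** 2)]
    return np.array(feats)

# Synthetic 64x64 "eye images" in three classes (normal / early / advanced)
rng = np.random.default_rng(4)
images = rng.normal(size=(120, 64, 64))
labels = np.repeat([0, 1, 2], 40)
images[labels == 1] += 0.5 * rng.normal(size=(64, 64))   # weak class differences
images[labels == 2] += 1.0 * rng.normal(size=(64, 64))

X = np.array([dwt_features(img) for img in images])
scores = cross_val_score(SVC(kernel="rbf"), X, labels, cv=5)
print(f"CV accuracy: {scores.mean():.2f}")
```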
Procedia PDF Downloads 336
8894 Theoretical Modelling of Molecular Mechanisms in Stimuli-Responsive Polymers
Authors: Catherine Vasnetsov, Victor Vasnetsov
Abstract:
Context: Thermo-responsive polymers are materials that undergo significant changes in their physical properties in response to temperature changes. These polymers have gained significant attention in research due to their potential applications in various industries and medicine. However, the molecular mechanisms underlying their behavior are not well understood, particularly in relation to cosolvency, which is crucial for practical applications. Research Aim: This study aimed to theoretically investigate the phenomenon of cosolvency in long-chain polymers using the Flory-Huggins statistical-mechanical framework. The main objective was to understand the interactions between the polymer, solvent, and cosolvent under different conditions. Methodology: The research employed a combination of Monte Carlo computer simulations and advanced machine-learning methods. The Flory-Huggins mean field theory was used as the basis for the simulations. Spinodal graphs and ternary plots were utilized to develop an initial computer model for predicting polymer behavior. Molecular dynamic simulations were conducted to mimic real-life polymer systems. Machine learning techniques were incorporated to enhance the accuracy and reliability of the simulations. Findings: The simulations revealed that the addition of very low or very high volumes of cosolvent molecules resulted in smaller radii of gyration for the polymer, indicating poor miscibility. However, intermediate volume fractions of cosolvent led to higher radii of gyration, suggesting improved miscibility. These findings provide a possible microscopic explanation for the cosolvency phenomenon in polymer systems. Theoretical Importance: This research contributes to a better understanding of the behavior of thermo-responsive polymers and the role of cosolvency. The findings provide insights into the molecular mechanisms underlying cosolvency and offer specific predictions for future experimental investigations. The study also presents a more rigorous analysis of the Flory-Huggins free energy theory in the context of polymer systems. Data Collection and Analysis Procedures: The data for this study was collected through Monte Carlo computer simulations and molecular dynamic simulations. The interactions between the polymer, solvent, and cosolvent were analyzed using the Flory-Huggins mean field theory. Machine learning techniques were employed to enhance the accuracy of the simulations. The collected data was then analyzed to determine the impact of cosolvent volume fractions on the radii of gyration of the polymer. Question Addressed: The research addressed the question of how cosolvency affects the behavior of long-chain polymers. Specifically, the study aimed to investigate the interactions between the polymer, solvent, and cosolvent under different volume fractions and understand the resulting changes in the radii of gyration. Conclusion: In conclusion, this study utilized theoretical modeling and computer simulations to investigate the phenomenon of cosolvency in long-chain polymers. The findings suggest that moderate cosolvent volume fractions can lead to improved miscibility, as indicated by higher radii of gyration. These insights contribute to a better understanding of the molecular mechanisms underlying cosolvency in polymer systems and provide predictions for future experimental studies. 
The research also enhances the theoretical analysis of the Flory-Huggins free energy theory.
Keywords: molecular modelling, flory-huggins, cosolvency, stimuli-responsive polymers
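For readers who want the underlying thermodynamics in code, here is a minimal, hedged sketch of the Flory-Huggins free energy of mixing per lattice site, f(φ) = (φ/N)ln φ + (1-φ)ln(1-φ) + χφ(1-φ), and its spinodal condition f''(φ) = 0; the chain length and χ value are illustrative, and this is not the study's Monte Carlo code.

```python
import numpy as np

def fh_free_energy(phi, chi, N):
    # Flory-Huggins mixing free energy per site (in units of kT)
    return phi / N * np.log(phi) + (1 - phi) * np.log(1 - phi) + chi * phi * (1 - phi)

def fh_d2f(phi, chi, N):
    # Second derivative: 1/(N*phi) + 1/(1 - phi) - 2*chi
    return 1.0 / (N * phi) + 1.0 / (1.0 - phi) - 2.0 * chi

N, chi = 100, 0.8                        # illustrative chain length and interaction parameter
phi = np.linspace(1e-4, 1 - 1e-4, 2000)  # polymer volume fraction
unstable = phi[fh_d2f(phi, chi, N) < 0]  # spinodal (unstable) region
if unstable.size:
    print(f"spinodal region: phi in [{unstable.min():.3f}, {unstable.max():.3f}]")
else:
    print("mixture stable at this chi")
```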
Procedia PDF Downloads 71
8893 Hyperspectral Mapping Methods for Differentiating Mangrove Species along Karachi Coast
Authors: Sher Muhammad, Mirza Muhammad Waqar
Abstract:
It is necessary to monitor and identify mangrove types and their spatial extent near coastal areas because mangroves play an important role in the coastal ecosystem and environmental protection. This research aims at identifying and mapping mangrove types along the Karachi coast, ranging from 24.79 to 24.85 degrees in latitude and 66.91 to 66.97 degrees in longitude, using hyperspectral remote sensing data and techniques. Imagery acquired in February 2012 by the Hyperion sensor has been used for this research. Image preprocessing includes geometric and radiometric correction, followed by Minimum Noise Fraction (MNF) and Pixel Purity Index (PPI). The output of MNF and PPI has been analyzed by visualizing it in n dimensions for end-member extraction. Well-distributed clusters on the n-dimensional scatter plot have been selected with the region of interest (ROI) tool as end members. These end members have been used as input for the classification techniques applied to identify and map mangrove species, including Spectral Angle Mapper (SAM), Spectral Feature Fitting (SFF), and Spectral Information Divergence (SID). Only two types of mangroves, namely Avicennia marina (white mangroves) and Avicennia germinans (black mangroves), have been observed throughout the study area.
Keywords: mangrove, hyperspectral, hyperion, SAM, SFF, SID
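To illustrate the SAM classifier named above, here is a minimal, hedged Python sketch that assigns each pixel spectrum to the end member with the smallest spectral angle; the two end-member spectra and pixel spectra are synthetic stand-ins, not Hyperion data.

```python
import numpy as np

def spectral_angle_mapper(pixels, endmembers):
    # theta = arccos( (p . e) / (|p| |e|) ) for every pixel-endmember pair
    p = pixels / np.linalg.norm(pixels, axis=1, keepdims=True)
    e = endmembers / np.linalg.norm(endmembers, axis=1, keepdims=True)
    angles = np.arccos(np.clip(p @ e.T, -1.0, 1.0))   # shape: (n_pixels, n_classes)
    return angles.argmin(axis=1), angles

# Two illustrative end-member spectra (e.g. the two Avicennia classes) over 50 bands
rng = np.random.default_rng(5)
em = np.abs(rng.normal(1.0, 0.3, size=(2, 50)))
pixels = np.vstack([em[0] * 1.1 + rng.normal(0, 0.02, 50),
                    em[1] * 0.9 + rng.normal(0, 0.02, 50)])
labels, angles = spectral_angle_mapper(pixels, em)
print("assigned classes:", labels)
```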
Procedia PDF Downloads 363
8892 Prevalence and Risk Factors of Low Back Disorder among Waste Collection Workers: A Systematic Review
Authors: Benedicta Asante, Catherine Trask, Brenna Bath
Abstract:
Background: Waste Collection Workers' (WCWs) activities contribute greatly to the recycling sector and are an important component of the waste management industry. As the recycling sector evolves, there is an increase in reports of injuries, particularly of common and debilitating musculoskeletal disorders such as low back disorder (LBD). WCWs are likely exposed to diverse work-related hazards that could contribute to LBD. However, there is currently no summary of the state of knowledge on the prevalence and risk factors of LBD within this workforce. Method: A comprehensive search was conducted in Ovid Medline, EMBASE, and Global Health e-publications with the search term categories 'low back disorder' and 'waste collection workers'. Two reviewers screened articles at the title, abstract, and full-text stages. Data were extracted on study design, sampling strategy, socio-demographics, geographical region, exposure definition, definition of LBD, response rate, statistical techniques, LBD prevalence, and risk factors. The risk of bias was assessed with a standardized tool. Results: The search of the three databases generated 79 studies. Thirty-two studies met the inclusion criteria for both title and abstract; only thirteen full-text articles met the study criteria and underwent data extraction. The majority of articles reported a 12-month prevalence of LBD between 16% and 74%. Although none of the included studies quantified relationships between risk factors and LBD, the suggested risk factors for LBD among WCWs included awkward posture, lifting, pulling, pushing, repetitive motions, work duration, and physical loads. Conclusion: LBD is a major occupational health issue among WCWs. In light of these risks and future growth in this industry, further research should focus on the investigation of risk factors, with greater emphasis on ergonomic exposure assessment and LBD prevention efforts.
Keywords: low back pain, scavenger, waste pickers, waste collection workers
Procedia PDF Downloads 256
8891 Concept for Knowledge out of Sri Lankan Non-State Sector: Performances of Higher Educational Institutes and Successes of Its Sector
Authors: S. Jeyarajan
Abstract:
A concept of knowledge is derived from a study conducted on successful competition among Sri Lankan non-state higher educational institutes. The concept was developed from knowledge management practices collected from Emerald and similarly reputed literature as well as from the non-state higher education sector. A test was conducted to reveal the existence of these collected practices in Sri Lankan non-state higher education institutes and the reasons behind them. The absence of such a study and uncertainty about the number of participants available for data collection in the Sri Lankan context led to the selection of a qualitative research method, which used attributes of the Delphi method to manage this uncertainty. Data were collected under the dramaturgical method, which supports efficient use of the Delphi method. Grounded theory was selected as the data analysis technique and was conducted in intermixed discourse to manage the different perspectives of the data, which were collected systematically through perspective and modified snowball sampling techniques. The data were then analysed using grounded theory development techniques in intermixed discourses to manage differences in the data. Consequently, agreement between the results of the grounded theories and the findings of a foreign study was discovered in the analysis, even though the present study was conducted as qualitative research while the foreign study was conducted as quantitative research. As such, the present study widens the discovery made in the foreign study. Further, having uncovered the reasons behind the existence of these practices, the present results offer a concept of knowledge from the Sri Lankan non-state sector for managing higher educational institutes successfully.
Keywords: adherence of snowball sampling into perspective sampling, Delphi method in qualitative method, grounded theory development in intermix discourses of analysis, knowledge management for success of higher educational institutes
Procedia PDF Downloads 175