Search results for: estimation after selection
2864 Service Life Prediction of Tunnel Structures Subjected to Water Seepage
Authors: Hassan Baji, Chun-Qing Li, Wei Yang
Abstract:
Water seepage is one of the most common causes of damage in tunnel structures and can cause both direct and indirect damage, e.g., reinforcement corrosion and calcium leaching. Estimation of water seepage or inflow is one of the main challenges in the probabilistic assessment of tunnels. The methodology proposed in this study is an attempt to mathematically model water seepage in tunnel structures and, from that model, to predict their service life. Using time-dependent reliability, water seepage is formulated as a failure mode, which can then be used for the prediction of service life. Application of the formulated seepage failure mode to a case study tunnel is presented.
Keywords: water seepage, tunnels, time-dependent reliability, service life
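The time-dependent reliability formulation lends itself to a simple Monte Carlo check. The sketch below is a minimal illustration, not the authors' calibrated model: the seepage-demand function, the distributions of permeability and hydraulic gradient, the allowable inflow q_allow, and the target failure probability are all hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def seepage_inflow(t_years, k, grad):
    """Hypothetical seepage demand: inflow grows as the lining degrades with time."""
    return k * grad * (1.0 + 0.03 * t_years)

# Hypothetical random variables for a Monte Carlo reliability analysis
n = 100_000
k = rng.lognormal(mean=np.log(1e-8), sigma=0.4, size=n)   # effective permeability
grad = rng.normal(loc=5.0e6, scale=5.0e5, size=n)         # hydraulic driving term
q_allow = 0.12                                            # allowable inflow (capacity)
target_pf = 0.05                                          # acceptable failure probability

for t in range(0, 121, 5):
    pf = np.mean(seepage_inflow(t, k, grad) > q_allow)    # P[demand > capacity] at time t
    print(f"t = {t:3d} yr, Pf = {pf:.4f}")
    if pf > target_pf:
        print(f"estimated service life is reached at about {t} years")
        break
```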
Procedia PDF Downloads 482
2863 Selection Effects on the Molecular and Abiotic Evolution of Antibiotic Resistance
Authors: Abishek Rajkumar
Abstract:
Antibiotic resistance can occur naturally, given the selective pressure that antibiotics place on bacteria. Within a large population of bacteria, there is a significant chance that some of those bacteria will develop resistance via mutations or genetic recombination. However, a growing public health concern has arisen from the fact that antibiotic resistance has increased significantly over the past few decades. This is because humans have been over-consuming and over-producing antibiotics, which has accelerated the antibiotic resistance seen in these bacteria. The result is an ongoing race between scientists and the bacteria: as bacteria continue to develop resistance, there is ever more demand for an antibiotic that can still kill the newly resistant strain. This paper focuses on several aspects of antibiotic resistance in bacteria, starting with how it occurs on a molecular level and then examining how antibiotic concentrations affect the resistance and fitness seen in bacteria.
Keywords: antibiotic, molecular, mutation, resistance
Procedia PDF Downloads 323
2862 An Application-Driven Procedure for Optimal Signal Digitization of Automotive-Grade Ultrasonic Sensors
Authors: Mohamed Shawki Elamir, Heinrich Gotzig, Raoul Zoellner, Patrick Maeder
Abstract:
In this work, a methodology is presented for identifying the optimal digitization parameters for the analog signal of ultrasonic sensors. These digitization parameters are the resolution of the analog-to-digital conversion and the sampling rate. This is accomplished through the derivation of characteristic curves based on the Fano inequality and the calculation of the mutual information content over a given dataset. The mutual information is calculated between the examples in the dataset and the corresponding variation in the feature that needs to be estimated. The optimal parameters are identified in a manner that ensures optimal estimation performance while avoiding the inefficiency of using unnecessarily powerful analog-to-digital converters.
Keywords: analog to digital conversion, digitization, sampling rate, ultrasonic
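The mutual-information criterion can be sketched in a few lines. The example below only illustrates the idea, not the paper's procedure: the synthetic echo-amplitude/distance relation, the noise level, and the discretization of the target feature are all assumptions.

```python
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(1)

# Hypothetical dataset: target distance and a noisy analog echo-amplitude feature
n = 20_000
distance = rng.uniform(0.2, 4.0, n)                  # feature to be estimated (metres)
analog = np.exp(-distance) + rng.normal(0, 0.02, n)  # assumed amplitude/distance relation

def quantize(x, bits):
    """Uniform quantization of x to the given ADC resolution."""
    levels = 2 ** bits
    lo, hi = x.min(), x.max()
    return np.floor((x - lo) / (hi - lo) * (levels - 1)).astype(int)

d_bins = np.digitize(distance, np.linspace(0.2, 4.0, 64))  # discretized target for the MI estimate

for bits in (2, 4, 6, 8, 10):
    mi = mutual_info_score(d_bins, quantize(analog, bits))  # MI (nats) between target and quantized signal
    print(f"{bits:2d}-bit ADC: I(distance; signal) ~ {mi:.3f} nats")
```

The mutual information typically saturates as the resolution grows, which is what makes "unnecessarily powerful" converters identifiable.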
Procedia PDF Downloads 207
2861 The Need for the Utilization of Instructional Materials on the Teaching and Learning of Agricultural Science Education in Developing Countries
Authors: Ogoh Andrew Enokela
Abstract:
This paper dwells on the need for the utilization of instructional materials, with highlights on the types of instructional materials, their selection, their uses, and their importance in the teaching and learning of Agricultural Science Education in developing countries. It further discusses the concept of improvisation, with some recommendations in terms of availability and utilization in the teaching and learning of Agricultural Science Education.
Keywords: instructional materials, agricultural science education, improvisation, teaching and learning
Procedia PDF Downloads 323
2860 A Fuzzy Nonlinear Regression Model for Interval Type-2 Fuzzy Sets
Authors: O. Poleshchuk, E. Komarov
Abstract:
This paper presents a regression model for interval type-2 fuzzy sets based on the least squares estimation technique. The unknown coefficients are assumed to be triangular fuzzy numbers. The basic idea is to determine aggregation intervals for type-1 fuzzy sets whose membership functions are the lower and upper membership functions of the interval type-2 fuzzy set. These aggregation intervals are called weighted intervals. The lower and upper membership functions of the input and output interval type-2 fuzzy sets for the developed regression models are considered as piecewise linear functions.
Keywords: interval type-2 fuzzy sets, fuzzy regression, weighted interval
Procedia PDF Downloads 373
2859 Block Matching Based Stereo Correspondence for Depth Calculation
Authors: G. Balakrishnan
Abstract:
Stereo correspondence plays a major role in estimating the distance of an object from a stereo camera pair for various applications. In this paper, a stereo correspondence algorithm based on a block-matching technique is presented. Initially, an energy matrix is calculated for every disparity obtained using a modified Sum of Absolute Differences (SAD). Higher energy matrix errors are removed by using a threshold value in order to reduce mismatch errors. A smoothening filter is applied to eliminate unreliable disparity estimates across object boundaries. The purpose is to improve the reliability of the calculated disparity map. The experimental results obtained show that the final depth map produces better results and can be used in all applications using stereo cameras.
Keywords: stereo matching, filters, energy matrix, disparity
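A minimal sketch of SAD-based block matching with cost thresholding and a median smoothing step is given below; the block size, disparity range, and threshold are assumed illustrative values rather than the paper's settings.

```python
import numpy as np
from scipy.ndimage import median_filter

def sad_disparity(left, right, block=7, max_disp=32, sad_thresh=None):
    """Block-matching disparity using the Sum of Absolute Differences (SAD) cost."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.float32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            ref = left[y - half:y + half + 1, x - half:x + half + 1].astype(np.int32)
            best_cost, best_d = np.inf, 0
            for d in range(0, min(max_disp, x - half) + 1):
                cand = right[y - half:y + half + 1, x - d - half:x - d + half + 1]
                cost = np.abs(ref - cand.astype(np.int32)).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            # Reject high-energy (unreliable) matches, mirroring the thresholding step
            disp[y, x] = best_d if (sad_thresh is None or best_cost < sad_thresh) else 0
    return disp

# Tiny synthetic example: the right image is the left image shifted by 4 pixels
rng = np.random.default_rng(0)
left = rng.integers(0, 255, (60, 80)).astype(np.uint8)
right = np.roll(left, -4, axis=1)
depth = median_filter(sad_disparity(left, right, max_disp=8), size=5)  # smoothening filter
print("median disparity in valid region:", np.median(depth[10:-10, 20:-20]))
```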
Procedia PDF Downloads 215
2858 Design and Construction of Temperature and Humidity Control Channel for a Bacteriological Incubator
Authors: Carlos R. Duharte Rodríguez, Ibrain Ceballo Acosta, Carmen B. Busoch Morlán, Angel Regueiro Gómez, Annet Martinez Hernández
Abstract:
This work presents the design and characterization of a prototype laboratory incubator to support research in Microbiology, in particular studies of bacterial growth in biological samples, with the help of optical methods (turbidimetry) and electrometric measurements of bioimpedance. It shows the results of simulation and experimentation for the proposed design of the measurement channels for the variables temperature and humidity, with high linearity achieved through the adequate selection of sensors and analog components for each channel, controlled with the help of an AT89C51 microcontroller (ATMEL) with adequate performance for this type of application.
Keywords: microbiology, bacterial growth, incubation station, microorganisms
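For illustration only, the loop below sketches the kind of on/off (hysteresis) regulation a temperature and humidity channel of this sort might run; the setpoints, bands, and the simulated sensor models are assumptions and do not come from the paper, where the channels are implemented around an AT89C51.

```python
import random

random.seed(0)
SET_TEMP, TEMP_BAND = 37.0, 0.5      # degrees C setpoint and hysteresis band (assumed values)
SET_RH, RH_BAND = 60.0, 2.0          # %RH setpoint and band (assumed values)

def read_temperature(state):
    """Stand-in for the incubator temperature channel (simulated drift + noise)."""
    drift = 0.2 if state["heater"] else -0.1
    state["temp"] += drift + random.uniform(-0.05, 0.05)
    return state["temp"]

def read_humidity(state):
    """Stand-in for the humidity channel (simulated drift + noise)."""
    drift = 0.5 if state["humidifier"] else -0.3
    state["rh"] += drift + random.uniform(-0.1, 0.1)
    return state["rh"]

state = {"temp": 25.0, "rh": 40.0, "heater": False, "humidifier": False}
for step in range(200):
    t, rh = read_temperature(state), read_humidity(state)
    # On/off control with hysteresis, one decision per channel
    if t < SET_TEMP - TEMP_BAND:
        state["heater"] = True
    elif t > SET_TEMP + TEMP_BAND:
        state["heater"] = False
    if rh < SET_RH - RH_BAND:
        state["humidifier"] = True
    elif rh > SET_RH + RH_BAND:
        state["humidifier"] = False
print(f"final: {state['temp']:.1f} C, {state['rh']:.1f} %RH")
```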
Procedia PDF Downloads 401
2857 Probability Modeling and Genetic Algorithms in Small Wind Turbine Design Optimization: Mentored Interdisciplinary Undergraduate Research at LaGuardia Community College
Authors: Marina Nechayeva, Malgorzata Marciniak, Vladimir Przhebelskiy, A. Dragutan, S. Lamichhane, S. Oikawa
Abstract:
This presentation is a progress report on a faculty-student research collaboration at CUNY LaGuardia Community College (LaGCC) aimed at designing a small horizontal axis wind turbine optimized for the wind patterns on the roof of our campus. Our project combines statistical and engineering research. Our wind modeling protocol is based upon a recent wind study by a faculty-student research group at MIT, and some of our blade design methods are adopted from a senior engineering project at CUNY City College. Our use of genetic algorithms has been inspired by the work on small wind turbines’ design by David Wood. We combine these diverse approaches in our interdisciplinary project in a way that has not been done before and improve upon certain techniques used by our predecessors. We employ several estimation methods to determine the best fitting parametric probability distribution model for the local wind speed data obtained through correlating short-term on-site measurements with a long-term time series at the nearby airport. The model serves as a foundation for engineering research that focuses on adapting and implementing genetic algorithms (GAs) to engineering optimization of the wind turbine design using Blade Element Momentum Theory. GAs are used to create new airfoils with desirable aerodynamic specifications. Small scale models of best performing designs are 3D printed and tested in the wind tunnel to verify the accuracy of relevant calculations. Genetic algorithms are applied to selected airfoils to determine the blade design (radial chord and pitch distribution) that would optimize the coefficient of power profile of the turbine. Our approach improves upon the traditional blade design methods in that it lets us dispense with assumptions necessary to simplify the system of Blade Element Momentum Theory equations, thus resulting in more accurate aerodynamic performance calculations. Furthermore, it enables us to design blades optimized for a whole range of wind speeds rather than a single value. Lastly, we improve upon known GA-based methods in that our algorithms are constructed to work with XFoil generated airfoils data which enables us to optimize blades using our own high glide ratio airfoil designs, without having to rely upon available empirical data from existing airfoils, such as NACA series. Beyond its immediate goal, this ongoing project serves as a training and selection platform for CUNY Research Scholars Program (CRSP) through its annual Aerodynamics and Wind Energy Research Seminar (AWERS), an undergraduate summer research boot camp, designed to introduce prospective researchers to the relevant theoretical background and methodology, get them up to speed with the current state of our research, and test their abilities and commitment to the program. Furthermore, several aspects of the research (e.g., writing code for 3D printing of airfoils) are adapted in the form of classroom research activities to enhance Calculus sequence instruction at LaGCC.
Keywords: engineering design optimization, genetic algorithms, horizontal axis wind turbine, wind modeling
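As a small illustration of the distribution-fitting step, the sketch below fits a few candidate parametric models to wind-speed data by maximum likelihood and compares their log-likelihoods; the data are synthetic stand-ins for the rooftop measurements, and the candidate set is an assumption.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
wind = rng.weibull(2.0, 2000) * 6.0   # synthetic wind speeds (m/s), stand-in for site data

# Fit candidate models (location fixed at zero) and compare their log-likelihoods
candidates = {"Weibull": stats.weibull_min, "Gamma": stats.gamma, "Lognormal": stats.lognorm}
for name, dist in candidates.items():
    params = dist.fit(wind, floc=0)
    loglik = np.sum(dist.logpdf(wind, *params))
    print(f"{name:10s} log-likelihood = {loglik:.1f}")
```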
Procedia PDF Downloads 231
2856 Estimation of Sediment Transport into a Reservoir Dam
Authors: Kiyoumars Roushangar, Saeid Sadaghian
Abstract:
Although accurate sediment load prediction is very important in the planning, design, operation and maintenance of water resources structures, the transport mechanism is complex, and deterministic transport models based on simplifying assumptions often lead to large prediction errors. In this research, firstly, two intelligent ANN methods, Radial Basis Function and General Regression Neural Networks, are adopted to model the total sediment load transported into the Madani Dam reservoir (north of Iran) using the measured data, and then the applicability of the sediment transport methods developed by Engelund and Hansen, Ackers and White, Yang, and Toffaleti for predicting sediment load discharge is evaluated. Based on a comparison of the results, it is found that the GRNN model gives better estimates than the sediment rating curve and the mentioned classic methods.
Keywords: sediment transport, dam reservoir, RBF, GRNN, prediction
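A GRNN is essentially Gaussian-kernel (Nadaraya-Watson) regression, which can be sketched in a few lines; the discharge/sediment data and the smoothing parameter below are synthetic assumptions, not the Madani Dam measurements.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.1):
    """General Regression Neural Network: Gaussian-kernel weighted average of training targets."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

# Hypothetical stand-in data: normalized water discharge -> sediment load
rng = np.random.default_rng(3)
q = rng.uniform(0, 1, (200, 1))
s = (q[:, 0] ** 1.8) + rng.normal(0, 0.03, 200)   # assumed rating-type relation plus noise

q_new = np.linspace(0, 1, 5).reshape(-1, 1)
print(grnn_predict(q, s, q_new, sigma=0.08))
```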
Procedia PDF Downloads 496
2855 Research on the Application of Flexible and Programmable Systems in Electronic Systems
Authors: Yang Xiaodong
Abstract:
This article explores the application and structural characteristics of flexible and programmable systems in electronic systems, with a focus on analyzing their advantages and architectural differences in dealing with complex environments. By introducing mathematical models and simulation experiments, the performance of dynamic module combination in flexible systems and of fixed path selection in programmable systems is demonstrated in terms of resource utilization and performance optimization. This article also discusses the mutual transformation between the two in practical applications and proposes a solution to improve system flexibility and performance through dynamic reconfiguration technology. This study provides a theoretical reference for the design and optimization of flexible and programmable systems.
Keywords: flexibility, programmable, electronic systems, system architecture
Procedia PDF Downloads 9
2854 Identifying Protein-Coding and Non-Coding Regions in Transcriptomes
Authors: Angela U. Makolo
Abstract:
Protein-coding and Non-coding regions determine the biology of a sequenced transcriptome. Research advances have shown that Non-coding regions are important in disease progression and clinical diagnosis. Existing bioinformatics tools have been targeted towards Protein-coding regions alone. Therefore, there are challenges associated with gaining biological insights from transcriptome sequence data. These tools are also limited to computationally intensive sequence alignment, which is inadequate and less accurate for identifying both Protein-coding and Non-coding regions. Alignment-free techniques can overcome this limitation and identify both regions. Therefore, this study was designed to develop an efficient sequence alignment-free model for identifying both Protein-coding and Non-coding regions in sequenced transcriptomes. Feature grouping and randomization procedures were applied to the input transcriptomes (37,503 data points). Successive iterations were carried out to compute the gradient vector that converged the developed Protein-coding and Non-coding Region Identifier (PNRI) model to the approximate coefficient vector. The logistic regression algorithm was used with a sigmoid activation function. A parameter vector was estimated for every sample in the 37,503 data points in a bid to reduce the generalization error and cost. Maximum Likelihood Estimation (MLE) was used for parameter estimation by taking the log-likelihood of six features and combining them into a summation function. Dynamic thresholding was used to classify the Protein-coding and Non-coding regions, and the Receiver Operating Characteristic (ROC) curve was determined. The generalization performance of PNRI was determined in terms of F1 score, accuracy, sensitivity, and specificity, and the average generalization performance was determined using a benchmark of multi-species organisms. The generalization error for identifying Protein-coding and Non-coding regions decreased from 0.514 to 0.508 and then to 0.378 over three iterations. The cost (the difference between the predicted and the actual outcome) also decreased, from 1.446 to 0.842 and then to 0.718, for the first, second and third iterations, respectively. The iterations terminated at the 390th epoch, with an error of 0.036 and a cost of 0.316. The computed elements of the parameter vector that maximized the objective function were 0.043, 0.519, 0.715, 0.878, 1.157, and 2.575. The PNRI gave an ROC of 0.97, indicating an improved predictive ability. The PNRI identified both Protein-coding and Non-coding regions with an F1 score of 0.970, accuracy of 0.969, sensitivity of 0.966, and specificity of 0.973. Using 13 non-human multi-species model organisms, the average generalization performance of the traditional method was 74.4%, while that of the developed model was 85.2%, making the developed model better at identifying Protein-coding and Non-coding regions in transcriptomes. The developed Protein-coding and Non-coding region identifier model efficiently identified the Protein-coding and Non-coding transcriptomic regions. It could be used in genome annotation and in the analysis of transcriptomes.
Keywords: sequence alignment-free model, dynamic thresholding classification, input randomization, genome annotation
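The core estimation step, maximum-likelihood logistic regression trained by gradient iterations with a sigmoid activation and a data-driven decision threshold, can be sketched as below. The six features and labels are synthetic stand-ins, and the thresholding rule shown is only one simple choice, not necessarily the PNRI's.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=400):
    """Maximum-likelihood logistic regression via batch gradient ascent on the log-likelihood."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = sigmoid(X @ w)
        grad = X.T @ (y - p) / len(y)   # gradient of the average log-likelihood
        w += lr * grad
    return w

# Hypothetical stand-in for six sequence-derived features and coding/non-coding labels
rng = np.random.default_rng(5)
n, k = 2000, 6
X = rng.normal(size=(n, k))
true_w = np.array([0.04, 0.5, 0.7, 0.9, 1.2, 2.5])
y = (sigmoid(X @ true_w) > rng.uniform(size=n)).astype(float)

w_hat = fit_logistic(X, y)
scores = sigmoid(X @ w_hat)
threshold = np.quantile(scores, 1 - y.mean())   # simple data-driven ("dynamic") threshold
pred = (scores >= threshold).astype(int)
print("estimated weights:", np.round(w_hat, 3))
print("accuracy:", (pred == y).mean())
```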
Procedia PDF Downloads 68
2853 Ontology-Based Systemizing of the Science Information Devoted to Waste Utilizing by Methanogenesis
Authors: Ye. Shapovalov, V. Shapovalov, O. Stryzhak, A. Salyuk
Abstract:
Over the past decades, the amount of scientific information has been growing exponentially, and it has become more complicated to process and systemize this amount of data. An approach to the systematization of scientific information on the production of biogas, based on the ontological IT platform “T.O.D.O.S.”, has been developed. It has been proposed to select semantic characteristics of each work for their further introduction into the IT platform “T.O.D.O.S.”. An ontological graph with a ranking function for previous scientific research and for a system of selection of microorganisms has been worked out. These systems provide high-performance management of scientific information.
Keywords: ontology-based analysis, analysis of scientific data, methanogenesis, microorganism hierarchy, 'T.O.D.O.S.'
Procedia PDF Downloads 164
2852 Case-Based Reasoning Approach for Process Planning of Internal Thread Cold Extrusion
Authors: D. Zhang, H. Y. Du, G. W. Li, J. Zeng, D. W. Zuo, Y. P. You
Abstract:
For the difficult issue of process selection, case-based reasoning technology is applied to a computer-aided process planning system for cold-form tapping of internal threads on the basis of process similarity. A model is established based on an analysis of the process planning task. The case representation and the similarity computing method are given. A confidence degree is used to evaluate the retrieved case. A rule-based reuse strategy is presented. The scheme is illustrated and verified by a practical application, which shows that the design results obtained with the proposed method are effective.
Keywords: case-based reasoning, internal thread, cold extrusion, process planning
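The retrieve-and-evaluate step can be illustrated with a small weighted nearest-neighbour sketch; the case features (thread diameter, pitch, material hardness), the weights, and the confidence rule are hypothetical examples rather than the paper's actual case representation.

```python
import numpy as np

def retrieve(case_base, query, weights):
    """Weighted nearest-neighbour retrieval with a simple confidence degree."""
    feats = np.array([c["features"] for c in case_base], dtype=float)
    q = np.asarray(query, dtype=float)
    # Normalised, weighted distance mapped to a similarity in roughly [0, 1]
    span = feats.max(axis=0) - feats.min(axis=0) + 1e-9
    dist = np.sqrt((((feats - q) / span) ** 2 * weights).sum(axis=1) / np.sum(weights))
    sim = 1.0 - dist
    best = int(np.argmax(sim))
    confidence = float(sim[best])   # reuse the case only if the confidence is high enough
    return case_base[best], confidence

# Hypothetical cases: (thread diameter mm, pitch mm, material hardness HB) -> stored process plan
cases = [
    {"features": (6, 1.0, 120), "plan": "plan-A"},
    {"features": (8, 1.25, 150), "plan": "plan-B"},
    {"features": (10, 1.5, 180), "plan": "plan-C"},
]
best, conf = retrieve(cases, (8, 1.25, 160), weights=np.array([0.4, 0.4, 0.2]))
print(best["plan"], round(conf, 3))
```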
Procedia PDF Downloads 510
2851 A Failure Investigation of High-Temperature Hydrogen Attack at a Platforming Unit Furnace Elbow
Authors: Altoumi Alndalusi
Abstract:
High-temperature hydrogen attack (HTHA) is a common failure phenomenon at elevated temperatures in hydrogen environments in the oil and gas field. The failure occurred after four years of service at the internal surface of a Platforming elbow. Both visual and microscopic examinations revealed that the failure initiated with blister formation followed by large cracking at the inner surface. The crack morphology showed that the crack depth was about 50% of the material wall thickness and that its behavior was generally intergranular. This study concluded that the main reason for the failure was incorrect material selection for the Platforming operating conditions.
Keywords: decarburization, failure, heat affected zone, morphology, partial pressure, plate form
Procedia PDF Downloads 156
2850 Statistical Estimation of Ionospheric Energy Dissipation Using Østgaard's Empirical Relation
Authors: M. A. Ahmadu, S. S. Rabia
Abstract:
During the past few decades, energy dissipation in the ionosphere resulting from geomagnetic activity has caused an increasing number of major disruptions of important power and communication services, malfunctions, and losses of expensive facilities. Here, the electron precipitation energy, w(ep), and the Joule heating energy, w(jh), were used in the computation of this dissipation using Østgaard's empirical relation with hourly geomagnetic indices of 2012, under the assumption that the magnetosphere does not store any energy, so that the activity begins at t1 = 0 and ends at t2 = t. The statistical results obtained show that the ionospheric dissipation varies from month to month, day to day and hour to hour, and is estimated at a value of ~3.6 w(ep), which is in agreement with experimental results.
Keywords: Østgaard's, ionospheric dissipation, joule heating, electron precipitation, geomagnetic indices, empirical relation
Procedia PDF Downloads 293
2849 Application of Deep Learning and Ensemble Methods for Biomarker Discovery in Diabetic Nephropathy through Fibrosis and Propionate Metabolism Pathways
Authors: Oluwafunmibi Omotayo Fasanya, Augustine Kena Adjei
Abstract:
Diabetic nephropathy (DN) is a major complication of diabetes, with fibrosis and propionate metabolism playing critical roles in its progression. Identifying biomarkers linked to these pathways may provide novel insights into DN diagnosis and treatment. This study aims to identify biomarkers associated with fibrosis and propionate metabolism in DN, analyze the biological pathways and regulatory mechanisms of these biomarkers, and develop a machine learning model to predict DN-related biomarkers and validate their functional roles. Publicly available transcriptome datasets related to DN (GSE96804 and GSE104948) were obtained from the GEO database (https://www.ncbi.nlm.nih.gov/gds), and 924 propionate metabolism-related genes (PMRGs) and 656 fibrosis-related genes (FRGs) were identified. The analysis began with the extraction of DN-differentially expressed genes (DN-DEGs) and propionate metabolism-related DEGs (PM-DEGs), followed by the intersection of these with fibrosis-related genes to identify key intersected genes. Instead of relying on traditional models, we employed a combination of deep neural networks (DNNs) and ensemble methods such as Gradient Boosting Machines (GBM) and XGBoost to enhance feature selection and biomarker discovery. Recursive feature elimination (RFE) was coupled with these advanced algorithms to refine the selection of the most critical biomarkers. Functional validation was conducted using convolutional neural networks (CNN) for gene set enrichment and immunoinfiltration analysis, revealing seven significant biomarkers: SLC37A4, ACOX2, GPD1, ACE2, SLC9A3, AGT, and PLG. These biomarkers are involved in critical biological processes such as fatty acid metabolism and glomerular development, providing a mechanistic link to DN progression. Furthermore, a TF–miRNA–mRNA regulatory network was constructed using natural language processing models to identify 8 transcription factors and 60 miRNAs that regulate these biomarkers, while a drug–gene interaction network revealed potential therapeutic targets such as UROKINASE–PLG and ATENOLOL–AGT. This integrative approach, leveraging deep learning and ensemble models, not only enhances the accuracy of biomarker discovery but also offers new perspectives on DN diagnosis and treatment, specifically targeting fibrosis and propionate metabolism pathways.
Keywords: diabetic nephropathy, deep neural networks, gradient boosting machines (GBM), XGBoost
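The feature-selection step, recursive feature elimination wrapped around a gradient-boosting model, can be sketched with scikit-learn as below; the expression matrix is synthetic, and the number of retained features is set to seven only to mirror the seven biomarkers reported.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.feature_selection import RFE

# Stand-in expression matrix: rows = samples (DN vs control), columns = candidate genes.
# The real study used GEO datasets (GSE96804, GSE104948); synthetic data are used here.
X, y = make_classification(n_samples=120, n_features=200, n_informative=7,
                           n_redundant=0, random_state=0)

gbm = GradientBoostingClassifier(n_estimators=200, random_state=0)
selector = RFE(estimator=gbm, n_features_to_select=7, step=0.1)   # recursive feature elimination
selector.fit(X, y)

biomarker_idx = np.flatnonzero(selector.support_)
print("selected feature indices:", biomarker_idx)
print("ranking of the first ten features:", selector.ranking_[:10])
```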
Procedia PDF Downloads 9
2848 Advertising Incentives of National Brands against Private Labels: The Case of OTC Heartburn Drugs
Authors: Lu Liao
Abstract:
The worldwide expansion of private labels over the past two decades not only transformed the choice sets of consumers but also forced manufacturers of national brands to design new marketing strategies to maintain their market positions. This paper empirically analyzes the impact of private labels on advertising incentives of national brands. The paper first develops a consumer demand model that incorporates spillover effects of advertising and finds positive spillovers of national brands’ advertising on demand for private label products. With the demand estimates, the researcher simulates the equilibrium prices and advertising levels for leading national brands in a counterfactual where private labels are eliminated to quantify the changes in national brands’ advertising incentives in response to the rise of private labels.
Keywords: advertising, demand estimation, spillover effect, structural model
Procedia PDF Downloads 23
2847 Examining the Links between Fish Behaviour and Physiology for Resilience in the Anthropocene
Authors: Lauren A. Bailey, Amber R. Childs, Nicola C. James, Murray I. Duncan, Alexander Winkler, Warren M. Potts
Abstract:
Changes in behaviour and physiology are the most important responses of marine life to anthropogenic impacts such as climate change and over-fishing. Behavioural changes (such as a shift in distribution or changes in phenology) can ensure that a species remains in an environment suited for its optimal physiological performance. However, if marine life is unable to shift its distribution, it is reliant on physiological adaptation (either by broadening its metabolic curves to tolerate a range of stressors or by shifting its metabolic curves to maximize its performance at extreme stressors). However, since there are links between fish physiology and behaviour, changes to either of these traits may have reciprocal interactions. This paper reviews the current knowledge of the links between the behaviour and physiology of fishes, discusses these in the context of exploitation and climate change, and makes recommendations for future research needs. The review revealed that our understanding of the links between fish behaviour and physiology is rudimentary. However, both are hypothesized to be linked to stress responses along the hypothalamic-pituitary axis. The link between physiological capacity and behaviour is particularly important, as both determine the response of an individual to a changing climate and are under selection by fisheries. While it appears that all types of capture fisheries are likely to reduce the adaptive potential of fished populations to climate stressors, angling, which is primarily associated with recreational fishing, may induce fission of natural populations by removing individuals with bold behavioural traits and potentially the physiological traits required to facilitate behavioural change. Future research should focus on assessing how the links between physiological capacity and behaviour influence catchability, the response to climate change drivers, and post-release recovery. The plasticity of phenotypic traits should be examined under a range of stressors of differing intensity in several species and life history stages. Future studies should also assess plasticity (fission or fusion) in the phenotypic structuring of social hierarchy and how this influences habitat selection. Ultimately, to fully understand how physiology is influenced by the selective processes driven by fisheries, long-term monitoring of the physiological and behavioural structure of fished populations, their fitness, and catch rates is required.
Keywords: climate change, metabolic shifts, over-fishing, phenotypic plasticity, stress response
Procedia PDF Downloads 118
2846 A Bayesian Model with Improved Prior in Extreme Value Problems
Authors: Eva L. Sanjuán, Jacinto Martín, M. Isabel Parra, Mario M. Pizarro
Abstract:
In Extreme Value Theory, inference on the parameters of the distribution is made employing only a small part of the observed values. When block maxima are taken, many data are discarded. We developed a new Bayesian inference model to seize all the information provided by the data, introducing informative priors and using the relations between the baseline and limit parameters. Firstly, we studied the accuracy of the new model for three baseline distributions that lead to a Gumbel extreme distribution: Exponential, Normal and Gumbel. Secondly, we considered mixtures of Normal variables, to simulate practical situations in which the data do not fit pure distributions because of perturbations (noise).
Keywords: bayesian inference, extreme value theory, Gumbel distribution, highly informative prior
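The idea of tying an informative prior to the baseline distribution can be sketched with a toy random-walk Metropolis sampler: block maxima of an Exponential(1) baseline are approximately Gumbel with location log m and unit scale, so the prior is centred there. The prior widths, block size, and sampler settings below are assumptions for illustration, not the authors' specification.

```python
import numpy as np

rng = np.random.default_rng(11)

# Block maxima from an Exponential(1) baseline: maxima of blocks of size m are
# approximately Gumbel(loc = log m, scale = 1), which motivates the informative prior.
m, n_blocks = 50, 60
maxima = rng.exponential(1.0, (n_blocks, m)).max(axis=1)

def log_post(mu, beta):
    if beta <= 0:
        return -np.inf
    z = (maxima - mu) / beta
    loglik = -np.sum(np.log(beta) + z + np.exp(-z))           # Gumbel log-likelihood
    # Informative priors centred on the baseline-implied values (assumed widths)
    logprior = -0.5 * ((mu - np.log(m)) / 0.5) ** 2 - 0.5 * ((beta - 1.0) / 0.3) ** 2
    return loglik + logprior

# Random-walk Metropolis sampler
mu, beta = np.log(m), 1.0
samples = []
for _ in range(20000):
    mu_p, beta_p = mu + rng.normal(0, 0.1), beta + rng.normal(0, 0.05)
    if np.log(rng.uniform()) < log_post(mu_p, beta_p) - log_post(mu, beta):
        mu, beta = mu_p, beta_p
    samples.append((mu, beta))
post = np.array(samples[5000:])
print("posterior mean (mu, beta):", post.mean(axis=0))
```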
Procedia PDF Downloads 198
2845 Spare Part Carbon Footprint Reduction with Reman Applications
Authors: Enes Huylu, Sude Erkin, Nur A. Özdemir, Hatice K. Güney, Cemre S. Atılgan, Hüseyin Y. Altıntaş, Aysemin Top, Muammer Yılman, Özak Durmuş
Abstract:
Remanufacturing (reman) applications allow manufacturers to contribute to the circular economy and help to introduce products of almost the same quality, environment-friendly and at lower cost. The objective of this study is to show that the carbon footprint of automotive spare parts used in vehicles can be reduced by reman applications, based on a Life Cycle Analysis framed by ISO 14040 principles. In this case, the aim was to investigate reman applications for 21 parts in total. So far, research and calculations have been completed for the alternator, turbocharger, starter motor, compressor, manual transmission, automatic transmission, and DPF (diesel particulate filter) parts. Since the aim of Ford Motor Company and Ford OTOSAN is to achieve net zero based on Science-Based Targets (SBT) and the European Union Green Deal, which sets out to make the EU climate neutral by 2050, the effects of reman applications are researched. Firstly, remanufacturing articles available in the literature were searched based on the yearly high volume of spare parts sold. Based on the paper review results related to material composition and the emissions released during the original production and remanufacturing phases, a base part was selected as a reference. Then, the data on the selected base part from the literature are used to make an approximate estimation of the carbon footprint reduction of the corresponding part used in Ford OTOSAN. The estimation model is based on the weight and material composition of the reman activity in the referenced paper. As a result of this study, it was seen that remanufacturing applications are technically and environmentally feasible, since they have significant effects on reducing the emissions released during the production phase of vehicle components. For this reason, the research and calculations for the total number of targeted products in yearly volume have been completed to a large extent. Thus, based on the targeted parts whose research has been completed, and in line with the net zero targets of Ford Motor Company and Ford OTOSAN by 2050, if remanufacturing applications are preferred instead of current production methods, it is possible to reduce a significant amount of the associated greenhouse gas (GHG) emissions of spare parts used in vehicles. Besides, it is observed that remanufacturing helps to reduce the waste stream and causes less pollution than making products from raw materials, by reusing the automotive components.
Keywords: greenhouse gas emissions, net zero targets, remanufacturing, spare parts, sustainability
Procedia PDF Downloads 82
2844 Estimation of Soil Nutrient Content Using Google Earth and Pleiades Satellite Imagery for Small Farms
Authors: Lucas Barbosa Da Silva, Jun Okamoto Jr.
Abstract:
Precision agriculture has long benefited from aerial imagery of crop fields. This important tool has allowed identifying patterns in crop fields, generating useful information for production management. Reflectance intensity data in different ranges of the electromagnetic spectrum may indicate the presence or absence of nutrients in the soil of an area. Different relations between the light bands may generate even more detailed information. Knowledge of the nutrient content in the soil or in the crop during its growth is a valuable asset to the farmer who seeks to optimize yield. However, small farmers in Brazil often lack the resources to access this kind of information, and, even when they do, it is not presented in a comprehensive and/or objective way. So, the challenges of implementing this technology range from the sampling of the imagery using aerial platforms, building a mosaic with the images to cover the entire crop field, and extracting the reflectance information and analyzing its relationship with the parameters of interest, to displaying the results in a manner that allows the farmer to take the necessary decisions more objectively. In this work, an analysis of soil nutrient content based on image processing of satellite imagery is proposed, and its outputs are compared with a commercial laboratory's chemical analysis. Also, sources of satellite imagery are compared, to assess the feasibility of using Google Earth data in this application, and the impacts of doing so, versus the application of imagery from satellites such as Landsat-8 and Pleiades. Furthermore, an algorithm for building mosaics is implemented using Google Earth imagery and, finally, the possibility of using unmanned aerial vehicles is analyzed. From the data obtained, some soil parameters are estimated, namely the content of Potassium, Phosphorus, Boron, and Manganese, among others. The suitability of Google Earth imagery for this application is verified within a reasonable margin when compared to Pleiades satellite imagery and to the current commercial model. It is also verified that the mosaic construction method has little or no influence on the estimation results. Variability maps are created over the covered area, and the impacts of the image resolution and sample time frame are discussed, allowing easy assessment of the results. The final results show that easy and cheaper remote sensing and analysis methods are possible and feasible alternatives for the small farmer, with little access to technological and/or financial resources, to make more accurate decisions about soil nutrient management.
Keywords: remote sensing, precision agriculture, mosaic, soil, nutrient content, satellite imagery, aerial imagery
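A minimal sketch of the calibration idea, relating a band-ratio index to laboratory-measured nutrient content by least squares, is shown below. The reflectance values, the assumed NDVI/potassium relation, and the units are synthetic placeholders; in practice the bands would come from the Google Earth or Pleiades mosaics and the reference values from the commercial laboratory analysis.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in reflectance bands for sampled plots (values in [0, 1])
n_plots = 40
red = rng.uniform(0.05, 0.35, n_plots)
nir = rng.uniform(0.2, 0.6, n_plots)
ndvi = (nir - red) / (nir + red)                              # band-ratio index used as predictor

potassium_lab = 80 + 120 * ndvi + rng.normal(0, 8, n_plots)   # assumed relation to lab values

# Least-squares calibration: nutrient content ~ a + b * NDVI
A = np.column_stack([np.ones(n_plots), ndvi])
coef, *_ = np.linalg.lstsq(A, potassium_lab, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((potassium_lab - pred) ** 2) / np.sum((potassium_lab - potassium_lab.mean()) ** 2)
print(f"K ~ {coef[0]:.1f} + {coef[1]:.1f} * NDVI, R^2 = {r2:.2f}")
```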
Procedia PDF Downloads 175
2843 The Analysis of TRACE/PARCS in the Simulation of Ultimate Response Guideline for Lungmen ABWR
Authors: J. R. Wang, W. Y. Li, H. T. Lin, B. H. Lee, C. Shih, S. W. Chen
Abstract:
In this research, the TRACE/PARCS model of the Lungmen ABWR has been developed for verification of the ultimate response guideline (URG) efficiency. This ultimate measure was named the DIVing plan, abbreviated from system Depressurization, water Injection and containment Venting. The simulation initial condition is 100% rated power / 100% rated core flow. This research first uses TRACE/PARCS to estimate the time at which the fuel might be damaged with no water injection. Then, the effects of the reactor core isolation cooling (RCIC) system, controlled depressurization and the AC-independent water addition system (ACIWA), which can provide injection at 950 gpm, are also estimated for the station blackout (SBO) transient.
Keywords: ABWR, TRACE, safety analysis, PARCS
Procedia PDF Downloads 455
2842 A Design of the Organic Rankine Cycle for the Low Temperature Waste Heat
Abstract:
The design of an Organic Rankine Cycle (ORC) with heat regeneration and superheating processes is the subject of this paper. The maximum temperature level in the ORC is considered to be 110 °C, and the maximum pressure varies up to 2.5 MPa. The selection process for the appropriate working fluid and the thermal design and calculation of the cycle and its components are described. With respect to safety, toxicity, flammability, price and thermal cycle efficiency, the working fluid selected is R134a. As a particular example, the thermal design of the condenser used for the ORC engine with a theoretical thermal power of 179 kW is introduced. The minimal heat transfer area for complete condensation was determined to be approximately 520 m².
Keywords: organic rankine cycle, thermal efficiency, working fluids, environmental engineering
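The condenser sizing rests on the relation Q = U · A · LMTD; the short sketch below only illustrates that relation. The overall heat transfer coefficient and the temperature levels are placeholder assumptions, so the computed area differs from the roughly 520 m² obtained with the paper's actual design conditions.

```python
import math

# Back-of-the-envelope condenser sizing, Q = U * A * LMTD.
Q = 179e3            # condenser duty (W), the theoretical thermal power quoted in the abstract
U = 800.0            # overall heat transfer coefficient (W/m2K), assumed illustrative value
t_cond = 40.0        # condensing temperature (deg C), assumed
t_cool_in, t_cool_out = 25.0, 35.0   # cooling water temperatures (deg C), assumed

dT1 = t_cond - t_cool_in
dT2 = t_cond - t_cool_out
lmtd = (dT1 - dT2) / math.log(dT1 / dT2)   # log-mean temperature difference
A = Q / (U * lmtd)
print(f"LMTD = {lmtd:.1f} K, required area with these assumptions ~ {A:.0f} m2")
```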
Procedia PDF Downloads 460
2841 Optimal ECG Sampling Frequency for Multiscale Entropy-Based HRV
Authors: Manjit Singh
Abstract:
Multiscale entropy (MSE) is an extensively used index that provides a general understanding of the multiple complexities of the physiologic mechanisms of heart rate variability (HRV), which operate over a wide range of time scales. Accurate selection of the electrocardiogram (ECG) sampling frequency is an essential concern for clinically significant HRV quantification; a high ECG sampling rate increases memory requirements and processing time, whereas a low sampling rate degrades signal quality and results in clinically misinterpreted HRV. In this work, the impact of the ECG sampling frequency on MSE-based HRV has been quantified. MSE measures are found to be sensitive to the ECG sampling frequency, and the effect of the sampling frequency is a function of the time scale.
Keywords: ECG (electrocardiogram), heart rate variability (HRV), multiscale entropy, sampling frequency
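Multiscale entropy itself is coarse-graining followed by sample entropy at each scale; a naive sketch is below. The synthetic RR-interval series, the embedding dimension m = 2, and the tolerance r = 0.2·SD are conventional assumptions, not the study's data or settings.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy SampEn(m, r) of a 1-D series (naive O(N^2) implementation)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    def pair_count(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        return (np.sum(d <= r) - len(templates)) / 2          # exclude self-matches
    b, a = pair_count(m), pair_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, scales=(1, 2, 3, 4, 5)):
    """Coarse-grain the series at each scale, then compute sample entropy."""
    out = []
    for s in scales:
        n = len(x) // s
        coarse = np.asarray(x[:n * s], dtype=float).reshape(n, s).mean(axis=1)
        out.append(sample_entropy(coarse))
    return out

# Synthetic RR-interval series as a stand-in for HRV data derived from ECG
rng = np.random.default_rng(4)
rr = 0.8 + 0.05 * np.sin(np.linspace(0, 40, 1000)) + rng.normal(0, 0.02, 1000)
print([round(v, 3) for v in multiscale_entropy(rr)])
```

Re-deriving the RR series from ECG resampled at different rates and repeating this computation is, in essence, the sensitivity analysis the abstract describes.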
Procedia PDF Downloads 271
2840 Copula-Based Estimation of Direct and Indirect Effects in Path Analysis Models
Authors: Alam Ali, Ashok Kumar Pathak
Abstract:
Path analysis is a statistical technique used to evaluate the direct and indirect effects of variables in path models. One or more structural regression equations are used to estimate a series of parameters in path models to find a better fit to the data. However, sometimes the assumptions of classical regression models, such as ordinary least squares (OLS), are violated by the nature of the data, resulting in insignificant direct and indirect effects of the exogenous variables. This article aims to explore the effectiveness of a copula-based regression approach as an alternative to classical regression, specifically when variables are linked through an elliptical copula.
Keywords: path analysis, copula-based regression models, direct and indirect effects, k-fold cross validation technique
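As a generic illustration (not the authors' estimator), the sketch below runs a Gaussian-copula-flavoured path analysis: each margin is mapped to normal scores by ranks, path coefficients are estimated on the transformed scale, and the indirect effect is the product of the paths. The mediation structure X -> M -> Y and the skewed synthetic margins are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

# Synthetic mediation-type path model X -> M -> Y with skewed (non-normal) margins
n = 1000
x = rng.gamma(2.0, 1.0, n)
m = 0.6 * x + rng.gamma(2.0, 1.0, n)
y = 0.3 * x + 0.5 * m + rng.gamma(2.0, 1.0, n)

def normal_scores(v):
    """Rank-based transform to standard normal scores (the Gaussian-copula margin step)."""
    ranks = stats.rankdata(v)
    return stats.norm.ppf(ranks / (len(v) + 1))

Z = np.column_stack([normal_scores(v) for v in (x, m, y)])

def path_coef(z_target, z_predictors):
    beta, *_ = np.linalg.lstsq(z_predictors, z_target, rcond=None)
    return beta

a = path_coef(Z[:, 1], Z[:, [0]])[0]      # path X -> M
b, c = path_coef(Z[:, 2], Z[:, [1, 0]])   # paths M -> Y (b) and direct X -> Y (c)
print(f"direct effect ~ {c:.3f}, indirect effect ~ {a * b:.3f}")
```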
Procedia PDF Downloads 41
2839 Economic Assessment Methodology to Support Decisions for Transport Infrastructure Development
Authors: Dimitrios J. Dimitriou
Abstract:
The decades after the end of the Second World War provide evidence that infrastructure investments contribute to economic development, in terms of productivity and income growth. In order to boost productivity and increase competitiveness, the financing of large transport infrastructure projects is at the top of the agenda in the strategic planning process. Such a decision may take from some days to some decades, and stakeholders as well as decision makers need tools to estimate the economic impact of such an investment on the national economy. The key question in such decisions is whether the effects of the new infrastructure can boost economic development on the one hand, and create new jobs and activities on the other. This paper reviews the estimation of the economic effects of mega transport infrastructure projects on the economy.
Keywords: economic impact, transport infrastructure, strategic planning, decision making
Procedia PDF Downloads 290
2838 Applying Sequential Pattern Mining to Generate Block for Scheduling Problems
Authors: Meng-Hui Chen, Chen-Yu Kao, Chia-Yu Hsu, Pei-Chann Chang
Abstract:
The main idea in this paper is to use sequential pattern mining to find information that is helpful for finding high-performance solutions. By combining this information, blocks are defined. Using the blocks to generate artificial chromosomes (ACs) can improve the structure of the solutions. Estimation of Distribution Algorithms (EDAs) are adapted to solve the combinatorial problems. Although many of these approaches are advantageous for this application, only some of them are used to enhance its efficiency. Generating ACs from the mined patterns within the EDA can increase diversity. According to the experimental results, the proposed algorithm has better performance in solving permutation flow-shop problems.
Keywords: combinatorial problems, sequential pattern mining, estimation of distribution algorithms, artificial chromosomes
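A minimal version of the block idea, mining frequent adjacent job pairs from elite sequences and reusing them to seed artificial chromosomes, is sketched below; the elite set, the minimum support, and the AC construction rule are illustrative assumptions rather than the paper's algorithm.

```python
from collections import Counter
import random

random.seed(0)

def make_elite(n_perms=30, n_jobs=10):
    """Hypothetical elite job sequences; many of them share the ordering block (2, 5)."""
    elite = []
    for _ in range(n_perms):
        p = random.sample(range(n_jobs), n_jobs)
        if random.random() < 0.7:                 # plant a recurring block to mimic convergence
            p.remove(5)
            p.insert(p.index(2) + 1, 5)
        elite.append(p)
    return elite

def mine_blocks(elite, min_support=0.5):
    """Mine frequent adjacent job pairs (length-2 sequential patterns) as blocks."""
    counts = Counter((seq[i], seq[i + 1]) for seq in elite for i in range(len(seq) - 1))
    return [pair for pair, c in counts.items() if c / len(elite) >= min_support]

def artificial_chromosome(blocks, n_jobs=10):
    """Build an AC: place mined blocks first, then fill in the remaining jobs randomly."""
    chrom, used = [], set()
    for a, b in blocks:
        if a not in used and b not in used:
            chrom += [a, b]
            used.update((a, b))
    rest = [j for j in range(n_jobs) if j not in used]
    random.shuffle(rest)
    return chrom + rest

blocks = mine_blocks(make_elite())
print("mined blocks:", blocks)
print("artificial chromosome:", artificial_chromosome(blocks))
```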
Procedia PDF Downloads 611
2837 Using SNAP and RADTRAD to Establish the Analysis Model for Maanshan PWR Plant
Authors: J. R. Wang, H. C. Chen, C. Shih, S. W. Chen, J. H. Yang, Y. Chiang
Abstract:
In this study, we focus on the establishment of the analysis model for the Maanshan PWR nuclear power plant (NPP) by using the RADTRAD and SNAP codes together with the FSAR, manuals, and other data. In order to evaluate the cumulative dose at the Exclusion Area Boundary (EAB) and the Low Population Zone (LPZ) outer boundary, the Maanshan NPP RADTRAD/SNAP model was used to perform the analysis of the DBA LOCA case. The analysis results of RADTRAD were similar to the FSAR data. These analysis results were lower than the failure criteria of 10 CFR 100.11 (a total radiation dose to the whole body of 250 mSv; a total radiation dose to the thyroid from iodine exposure of 3000 mSv).
Keywords: RADionuclide, transport, removal, and dose estimation (RADTRAD), symbolic nuclear analysis package (SNAP), dose, PWR
Procedia PDF Downloads 464
2836 Impairments Correction of Six-Port Based Millimeter-Wave Radar
Authors: Dan Ohev Zion, Alon Cohen
Abstract:
In recent years, the presence of short-range millimeter-wave radar in civil applications has increased significantly. Autonomous driving, security, 3D imaging and high-data-rate communication systems are a few examples. The next challenge is integration inside small form-factor devices, such as smartphones (e.g., for gesture recognition). The main challenge is the implementation of a truly low-power, low-complexity, high-resolution radar. The most popular approach is the Frequency Modulated Continuous Wave (FMCW) radar with an analog multiplication front-end. In this paper, we present an approach for adaptive estimation and correction of the impairments of such a front-end, specifically one implemented using the Six-Port Device (SPD) as the multiplier element. The proposed algorithm was simulated and implemented on a 60 GHz radar lab prototype.
Keywords: radar, FMCW Radar, IQ mismatch, six port
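A generic, blind IQ-mismatch estimation with a Gram-Schmidt-style correction is sketched below as an illustration of the kind of impairment correction involved; the impairment values, the statistics-based estimator, and the image-rejection metric are assumptions and not the paper's adaptive algorithm.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic baseband output of an imperfect quadrature front-end: gain mismatch g and
# phase error phi distort an ideal complex tone (assumed impairment values).
n = 4096
theta = 2 * np.pi * 0.07 * np.arange(n)
g, phi = 1.15, np.deg2rad(8.0)
i = np.cos(theta) + rng.normal(0, 0.01, n)
q = g * np.sin(theta + phi) + rng.normal(0, 0.01, n)

# Blind estimation of the impairments from second-order statistics
alpha = np.sqrt(np.mean(q ** 2) / np.mean(i ** 2))        # gain mismatch estimate
sin_phi = np.mean(i * q) / (alpha * np.mean(i ** 2))      # phase error estimate

# Gram-Schmidt style correction: re-orthogonalise and re-normalise Q against I
q_corr = (q / alpha - sin_phi * i) / np.sqrt(1.0 - sin_phi ** 2)

def image_rejection_db(sig):
    """Ratio of the desired tone bin to its image bin in the complex spectrum."""
    spec = np.abs(np.fft.fft(sig * np.hanning(len(sig))))
    k = int(round(0.07 * len(sig)))
    return 20 * np.log10(spec[k] / spec[-k])

print(f"image rejection before: {image_rejection_db(i + 1j * q):.1f} dB")
print(f"image rejection after:  {image_rejection_db(i + 1j * q_corr):.1f} dB")
```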
Procedia PDF Downloads 152
2835 Estimation of Stress Intensity Factors from near Crack Tip Field
Authors: Zhuang He, Andrei Kotousov
Abstract:
All current experimental methods for the determination of stress intensity factors are based on the assumption that the state of stress near the crack tip is plane stress. Therefore, these methods rely on strain and displacement measurements made outside the near-crack-tip region affected by the three-dimensional effects or by the process zone. In this paper, we develop and validate an experimental procedure for the evaluation of stress intensity factors from measurements of the out-of-plane displacements in the surface area controlled by 3D effects. The evaluation of stress intensity factors is possible when the process zone is sufficiently small and the displacement field generated by the 3D effects is fully encapsulated by the K-dominance region.
Keywords: digital image correlation, stress intensity factors, three-dimensional effects, transverse displacement
Procedia PDF Downloads 615