Search results for: slice thickness accuracy
3224 Comparison of White Sauce Prepared from Native and Chemically Modified Corn and Pearl Millet Starches
Authors: Marium Shaikh, Tahira M. Ali, Abid Hasnain
Abstract:
Physical and sensory properties of white sauces prepared from native and chemically modified corn and pearl millet starches were compared. Interestingly, no syneresis was observed in white sauce containing hydroxypropylated corn or pearl millet starch even after nine days of cold storage (4 °C), while the other modifications also reduced syneresis significantly in comparison to their native counterparts. White sauce containing succinylated corn starch showed the least oil separation due to its greater emulsion stability. Light microscopy was used to visualize the size and shape of fat globules, which were most homogeneously distributed in the succinylated and hydroxypropylated samples. Sensory results revealed that chemical modification of corn and pearl millet starch improved the consistency, thickness and overall acceptability of the white sauces. Viscosity profiles showed that the pasting parameters of native pearl millet starch are very similar to those of native corn starch, suggesting pearl millet starch as an alternative to corn starch. White sauce prepared from modified pearl millet starch also showed better cold-storage stability in terms of textural attributes such as hardness, cohesiveness, chewiness, and springiness.
Keywords: corn starch, pearl millet, hydroxypropylation, succinylation, white sauce
Procedia PDF Downloads 288
3223 Engine Thrust Estimation by Strain Gauging of Engine Mount Assembly
Authors: Rohit Vashistha, Amit Kumar Gupta, G. P. Ravishankar, Mahesh P. Padwale
Abstract:
Accurate thrust measurement is required for aircraft during takeoff and after ski-jump. In a developmental aircraft, takeoff from a ship is extremely critical, and the thrust produced by the engine should be known to the pilot before takeoff so that if it is insufficient, the takeoff can be aborted and an accident avoided. After ski-jump, the engine thrust must be known because the horizontal speed of the aircraft is less than the normal takeoff speed; the engine should be able to produce enough thrust to bring the airframe to nominal horizontal takeoff speed within the prescribed time limit. Contemporary low-bypass gas turbine engines generally have three mounts, where the two side mounts transfer the engine thrust to the airframe. The third mount only takes the weight component; it does not take any thrust component. In the present method of thrust estimation, the two side mounts are strain gauged, and the strain produced at various power settings is used to estimate the thrust produced by the engine. A quarter Wheatstone bridge is used to acquire the strain data. The engine mount assembly is tested on a universal testing machine to determine the equivalent elasticity of the assembly; this elasticity value is used in the analytical approach for estimation of engine thrust. The estimated thrust is compared with the test-bed load cell thrust data, and the experimental strain data are also compared with strain data obtained from FEM analysis. Experimental setup: The strain gauges are mounted on the tapered portion of the engine mount sleeve, at two diametrically opposite locations in the horizontal plane. In this way, the strain gauges do not register strain due to the weight of the engine (except a negligible strain due to the material's Poisson's ratio) or the hoop stress. Only the third mount strain gauge shows strain when the engine is not running, i.e. strain due to the weight of the engine. When the engine starts running, all the load is taken by the side mounts: the strain gauge on the forward side of the sleeve shows a compressive strain, and the strain gauge on the rear side shows a tensile strain. Results and conclusion: The analytical calculation shows that the hoop stresses dominate the bending stress. The thrust estimated by strain gauging shows better accuracy at higher power settings than at lower power settings: the accuracy at the maximum power setting is 99.7%, whereas at the lower power setting it is 78%.
Keywords: engine mounts, finite element analysis, strain gauge, stress
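The strain-to-thrust chain described above can be sketched numerically. The gauge factor, excitation voltage, elastic modulus, and sleeve area below are illustrative assumptions, not the paper's calibration data; the quarter-bridge formula is the standard small-strain approximation.

```python
# Hypothetical sketch of quarter-bridge strain acquisition and thrust
# estimation; all numeric constants are assumed, not measured values.

GAUGE_FACTOR = 2.0      # typical foil strain gauge
V_EXC = 5.0             # bridge excitation voltage (V)

def bridge_strain(v_out):
    """Strain from quarter-bridge output: eps = 4*Vout/(GF*Vexc)."""
    return 4.0 * v_out / (GAUGE_FACTOR * V_EXC)

def thrust_from_strain(strain, e_modulus, area):
    """Axial load carried by one mount sleeve: F = eps * E * A."""
    return strain * e_modulus * area

# Two side mounts share the thrust; the forward gauge reads compression
# and the rear gauge tension, so their difference cancels bending.
eps = bridge_strain(2.5e-3)                            # 2.5 mV bridge output
f_one_mount = thrust_from_strain(eps, 200e9, 4.0e-4)   # steel, 4 cm^2 sleeve
total_thrust = 2.0 * f_one_mount                       # both side mounts
```

In practice, the equivalent elasticity measured on the test machine would replace the idealized E*A term.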
Procedia PDF Downloads 488
3222 Market Index Trend Prediction Using Deep Learning and Risk Analysis
Authors: Shervin Alaei, Reza Moradi
Abstract:
Trading in financial markets is subject to risk due to their high volatility. Here, using an LSTM neural network and risk-based feature engineering, we developed a method that can accurately predict the trend of the Tehran Stock Exchange market index several days ahead. Our test results show that the proposed method, with an average prediction accuracy of more than 94%, is superior to other common machine learning algorithms. To the best of our knowledge, this is the first work incorporating deep learning and risk factors to accurately predict market trends.
Keywords: deep learning, LSTM, trend prediction, risk management, artificial neural networks
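The risk-based feature engineering step can be illustrated with a minimal sketch: log returns as the price signal, rolling volatility as a simple risk feature, and a binary up/down trend label. The window size and toy price series are assumptions; the abstract does not disclose the actual features or the LSTM architecture.

```python
import numpy as np

# Illustrative risk-based feature engineering for trend prediction;
# the prices and window are made up, not the Tehran-exchange dataset.

def make_features(prices, vol_window=5):
    prices = np.asarray(prices, dtype=float)
    log_ret = np.diff(np.log(prices))                  # daily log returns
    # rolling volatility of returns as a risk feature
    vol = np.array([log_ret[max(0, i - vol_window + 1):i + 1].std()
                    for i in range(len(log_ret))])
    # binary trend label: 1 if the next day's price rises
    label = (np.diff(prices) > 0).astype(int)
    return log_ret, vol, label

prices = [100, 101, 103, 102, 105, 104, 107]
ret, vol, y = make_features(prices)
```

Feature/label pairs like these would then be windowed into sequences and fed to the LSTM.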
Procedia PDF Downloads 161
3221 Predicting Costs in Construction Projects with Machine Learning: A Detailed Study Based on Activity-Level Data
Authors: Soheila Sadeghi
Abstract:
Construction projects are complex and often subject to significant cost overruns due to the multifaceted nature of the activities involved. Accurate cost estimation is crucial for effective budget planning and resource allocation. Traditional methods for predicting overruns often rely on expert judgment or analysis of historical data, which can be time-consuming, subjective, and may fail to consider important factors. However, with the increasing availability of data from construction projects, machine learning techniques can be leveraged to improve the accuracy of overrun predictions. This study applied machine learning algorithms to enhance the prediction of cost overruns in a case study of a construction project. The methodology involved the development and evaluation of two machine learning models: Random Forest and Neural Networks. Random Forest can handle high-dimensional data, capture complex relationships, and provide feature importance estimates. Neural Networks, particularly Deep Neural Networks (DNNs), are capable of automatically learning and modeling complex, non-linear relationships between input features and the target variable. These models can adapt to new data, reduce human bias, and uncover hidden patterns in the dataset. The findings of this study demonstrate that both Random Forest and Neural Networks can significantly improve the accuracy of cost overrun predictions compared to traditional methods. The Random Forest model also identified key cost drivers and risk factors, such as changes in the scope of work and delays in material delivery, which can inform better project risk management. However, the study acknowledges several limitations. First, the findings are based on a single construction project, which may limit the generalizability of the results to other projects or contexts. 
Second, the dataset, although comprehensive, may not capture all relevant factors influencing cost overruns, such as external economic conditions or political factors. Third, the study focuses primarily on cost overruns, while schedule overruns are not explicitly addressed. Future research should explore the application of machine learning techniques to a broader range of projects, incorporate additional data sources, and investigate the prediction of both cost and schedule overruns simultaneously.
Keywords: cost prediction, machine learning, project management, random forest, neural networks
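The Random Forest step, including the feature-importance output used to identify cost drivers, can be sketched on synthetic data. The feature names and the data-generating rule are assumptions standing in for the case-study's activity-level dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Minimal sketch of Random Forest overrun prediction on synthetic data;
# columns stand in for scope_changes, delivery_delay, crew_size.

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 3))
# overrun driven mostly by scope changes and delivery delays (assumed rule)
y = 3.0 * X[:, 0] + 2.0 * X[:, 1] + 0.1 * rng.normal(size=n)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
importances = model.feature_importances_   # highlights the dominant drivers
```

On real project data, the ranked importances would surface drivers such as scope changes and material-delivery delays, as the study reports.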
Procedia PDF Downloads 65
3220 Local Binary Patterns-Based Statistical Data Analysis for Accurate Soccer Match Prediction
Authors: Mohammad Ghahramani, Fahimeh Saei Manesh
Abstract:
Winning a soccer game requires thorough, deep analysis of the ongoing match; likewise, large gambling companies are in vital need of such analysis to reduce their losses to their customers. In this research work, we perform deep, real-time analysis of every soccer match around the world; what distinguishes our work from others is the focus on particular seasons, teams and partial analytics. Our contributions are presented in the platform called “Analyst Masters.” First, we introduce various sources of information available for soccer analysis for teams around the world, which helped us record live statistical data and information from more than 50,000 soccer matches a year. Our second and main contribution is our proposed in-play performance evaluation. The third contribution is developing new features from stable soccer matches. The statistics of soccer matches and their odds, before and in-play, are represented in image format versus time, including the halftime. Local binary patterns (LBP) are then employed to extract features from the images. Our analyses reveal strikingly interesting features and rules once a soccer match has reached sufficient stability. For example, our “8-minute rule” implies that if 'Team A' scores a goal and can maintain the result for at least 8 minutes, then a stable match will end in their favor. We could also make accurate pre-match predictions of scoring less/more than 2.5 goals. We use gradient boosting trees (GBT) to extract highly related features. Once the features are selected from this pool of data, decision trees decide whether the match is stable. A stable match is then passed to a post-processing stage that checks its properties, such as bettors' and punters' behavior and its statistical data, before issuing the prediction. The proposed method was trained on 140,000 soccer matches and tested on more than 100,000 samples, achieving 98% accuracy in selecting stable matches. Our database of 240,000 matches shows that one can earn over 20% betting profit per month using Analyst Masters. Such consistent profit outperforms human experts and shows the inefficiency of the betting market: top soccer tipsters achieve 50% accuracy and 8% monthly profit on average, and only on regional matches. Both our collected database of more than 240,000 soccer matches since 2012 and our algorithm would greatly benefit coaches and punters seeking accurate analysis.
Keywords: soccer, analytics, machine learning, database
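The LBP feature extractor at the core of the pipeline reduces to a simple per-pixel code. A minimal 8-neighbour version is sketched below on a toy 3x3 patch; the actual inputs are images of match statistics over time, which are not reproduced here.

```python
import numpy as np

# Minimal 8-neighbour local binary pattern (LBP) for one pixel;
# the 3x3 patch values are arbitrary illustration data.

def lbp_code(patch):
    """LBP of the centre pixel of a 3x3 patch: each neighbour >= centre
    sets one bit, reading clockwise from the top-left corner."""
    c = patch[1, 1]
    neighbours = [patch[0, 0], patch[0, 1], patch[0, 2], patch[1, 2],
                  patch[2, 2], patch[2, 1], patch[2, 0], patch[1, 0]]
    return sum(int(v >= c) << i for i, v in enumerate(neighbours))

patch = np.array([[6, 5, 2],
                  [7, 6, 1],
                  [9, 8, 7]])
code = lbp_code(patch)   # one texture code per pixel; histograms of
                         # these codes form the feature vector
```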
Procedia PDF Downloads 240
3219 Rangeland Monitoring by Computerized Technologies
Abstract:
Every piece of rangeland has a different set of physical and biological characteristics. The manager must therefore synthesize various kinds of information through regular monitoring to detect trends of change and make the right decisions for sustainable management. Range managers thus need computerized technologies to monitor rangeland and select the best management practices. Four examples of computerized technologies can benefit sustainable management: (1) Photographic method for cover measurement: The method was tested in different vegetation communities in semi-humid and arid regions. Interpretation of pictures of quadrats was done using ArcView software, and data analysis was done in SPSS using the paired t-test. Based on the results, the photographic method can generally be used to measure ground cover in most vegetation communities. (2) GPS application for matching ground samples to satellite pixels: In the two provinces of Tehran and Markazi, six reference points were selected, and at each point eight GPS models were tested. A significant relation among GPS model, time and location with the accuracy of the estimated coordinates was found. After selection of a suitable method, the coordinates of plots along four transects in each of 6 rangeland sites in Markazi province were recorded. The best time for GPS application was the morning hours, and the Etrex Vista model had less error than the other models. (3) Application of satellite data for rangeland monitoring: Focusing on the long-term variation of vegetation parameters such as vegetation cover and production is essential. Our study in grass and shrub lands showed significant correlations between quantitative vegetation characteristics and satellite data, so it is possible to monitor rangeland vegetation using digital data for sustainable utilization. (4) Rangeland suitability classification with GIS: Range suitability assessment can facilitate sustainable management planning. The outputs of three sub-models (sensitivity to erosion, water suitability and forage production) were entered into the final range suitability classification model. GIS facilitated the classification of range suitability and produced suitability maps for sheep grazing. In general, digital computers assist range managers in interpreting, modifying, calibrating and integrating information for correct management.
Keywords: computer, GPS, GIS, remote sensing, photographic method, monitoring, rangeland ecosystem, management, suitability, sheep grazing
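The paired t-test used to compare photographic cover estimates with field measurements can be sketched directly. The cover percentages below are invented for illustration; the test statistic is the standard mean difference divided by its standard error.

```python
import math

# Paired t-test sketch for comparing two cover-measurement methods;
# the cover values are hypothetical, not the study's field data.

def paired_t(x, y):
    """t statistic for paired samples, df = n - 1."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean = sum(d) / n
    var = sum((v - mean) ** 2 for v in d) / (n - 1)   # sample variance
    return mean / math.sqrt(var / n)

photo = [32.0, 41.0, 28.0, 35.0, 44.0]   # % cover from photographs
field = [30.0, 43.0, 27.0, 36.0, 42.0]   # % cover from quadrats
t = paired_t(photo, field)
```

A small |t| (compared against the t distribution with n-1 degrees of freedom) indicates no significant difference between the two methods, which is the basis for accepting the photographic method.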
Procedia PDF Downloads 371
3218 Development of a Model Based on Wavelets and Matrices for the Treatment of Weakly Singular Partial Integro-Differential Equations
Authors: Somveer Singh, Vineet Kumar Singh
Abstract:
We present a new viscoelasticity-based model for non-Newtonian fluids. We use a matrix-formulated algorithm to approximate solutions of a class of partial integro-differential equations with given initial and boundary conditions. Some numerical results are presented to simplify application of the operational matrix formulation and reduce the computational cost. Convergence analysis, error estimation and numerical stability of the method are also investigated. Finally, some test examples are given to demonstrate the accuracy and efficiency of the proposed method.
Keywords: Legendre wavelets, operational matrices, partial integro-differential equation, viscoelasticity
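The idea of an operational matrix can be illustrated for a plain Legendre basis on [-1, 1]: a matrix P maps the coefficients of a function to the coefficients of its antiderivative, so integration in the scheme becomes a matrix product. The paper's method uses Legendre wavelets, so this fixed-basis sketch is only an illustration of the concept.

```python
import numpy as np
from numpy.polynomial import legendre as L

# Operational matrix of integration for the first n Legendre polynomials
# on [-1, 1]; an assumed-simple stand-in for the paper's wavelet basis.

def integration_matrix(n):
    """P such that, for coefficients c of f in the Legendre basis,
    P @ c are the coefficients of the antiderivative from -1 to x."""
    P = np.zeros((n + 1, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = 1.0
        P[:, j] = L.legint(e, lbnd=-1)   # integrate basis function P_j
    return P

P = integration_matrix(3)
# integral of P_1(x) = x from -1 to x is (x^2 - 1)/2
c = P @ np.array([0.0, 1.0, 0.0])
```

Derivative, product, and weakly singular integral operators get analogous matrix representations, reducing the integro-differential equation to an algebraic system.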
Procedia PDF Downloads 340
3217 A Machine Learning Approach for Efficient Resource Management in Construction Projects
Authors: Soheila Sadeghi
Abstract:
Construction projects are complex and often subject to significant cost overruns due to the multifaceted nature of the activities involved. Accurate cost estimation is crucial for effective budget planning and resource allocation. Traditional methods for predicting overruns often rely on expert judgment or analysis of historical data, which can be time-consuming, subjective, and may fail to consider important factors. However, with the increasing availability of data from construction projects, machine learning techniques can be leveraged to improve the accuracy of overrun predictions. This study applied machine learning algorithms to enhance the prediction of cost overruns in a case study of a construction project. The methodology involved the development and evaluation of two machine learning models: Random Forest and Neural Networks. Random Forest can handle high-dimensional data, capture complex relationships, and provide feature importance estimates. Neural Networks, particularly Deep Neural Networks (DNNs), are capable of automatically learning and modeling complex, non-linear relationships between input features and the target variable. These models can adapt to new data, reduce human bias, and uncover hidden patterns in the dataset. The findings of this study demonstrate that both Random Forest and Neural Networks can significantly improve the accuracy of cost overrun predictions compared to traditional methods. The Random Forest model also identified key cost drivers and risk factors, such as changes in the scope of work and delays in material delivery, which can inform better project risk management. However, the study acknowledges several limitations. First, the findings are based on a single construction project, which may limit the generalizability of the results to other projects or contexts. 
Second, the dataset, although comprehensive, may not capture all relevant factors influencing cost overruns, such as external economic conditions or political factors. Third, the study focuses primarily on cost overruns, while schedule overruns are not explicitly addressed. Future research should explore the application of machine learning techniques to a broader range of projects, incorporate additional data sources, and investigate the prediction of both cost and schedule overruns simultaneously.
Keywords: resource allocation, machine learning, optimization, data-driven decision-making, project management
Procedia PDF Downloads 45
3216 Analysis of Experimentally Designed Soundproof Gypsum Partition Wall's Sections in Terms of Structural Engineering
Authors: Abdulkerim Ilgun, Ahmad Javid Zia
Abstract:
In developing countries, urban populations are increasing rapidly, and with this increase residential areas are experiencing major problems. Construction of high-rise buildings in confined spaces is one of the most practical solutions, but residents of high-rise buildings who share common areas face many problems. Irritating sound, known as noise, is one of the major ones. The second most important problem is the weight of high-rise buildings, which makes a structure more vulnerable to earthquakes; to decrease earthquake loads, it is very important to decrease the weight of the building. To address the noise problem while keeping the building weight at a minimum, an experimentally designed soundproof gypsum partition wall of optimum thickness was used in a high-rise building, and the results were compared with ordinary brick partition walls. In this comparison, the effects of the weights of the soundproof gypsum walls and the ordinary brick walls were investigated from a structural engineering standpoint.
Keywords: cellubor, gypsum board, gypsum partition walls, light partition walls, noise, sound
Procedia PDF Downloads 308
3215 The Optimal Location of Brickforce in Brickwork
Authors: Sandile Daniel Ngidi
Abstract:
Brickforce is a product consisting of two main parallel wires joined by in-line welded cross wires. Embedded in the normal thickness of the brickwork joint, the wires are manufactured to a flattened profile to simplify location in the mortar joint without steel build-up problems at lap positions, corners and junctions, or when used in conjunction with wall ties. Brickforce has been in continuous use since 1918. It is placed in the mortar between courses of bricks: in every course of the foundations and every course above lintel height, and otherwise in every fourth course between the foundations and lintel height, or between a concrete slab and lintel height. Brickforce strengthens and stabilizes the wall, especially when building on unstable ground, and gives brickwork increased resistance to tensile stresses. It uses high-tensile steel wires, which can withstand high forces with very little stretch; this helps keep crack widths to a minimum. Recently, a debate has opened about the purpose of using brickforce in single-story buildings. The debate is compounded by the fact that there is no consensus about the spacing of brickforce in brickwork or masonry. In addition, very little information has been published on the relative merits of using the same size of brickforce under the different atmospheric conditions in South Africa. This paper compares different types of brickforce systems used in different countries, and conclusions are made to identify the location of brickforce that optimizes the system.
Keywords: brickforce, masonry concrete, reinforcement, strengthening, wall panels
Procedia PDF Downloads 233
3214 Free Shape Optimisation of Cold Formed Steel Sections
Authors: Mina Mortazavi, Pezhman Sharafi
Abstract:
Cold-formed steel sections are popular construction materials used as structural or non-structural elements. The objective of this paper is to propose an optimisation method for open cross-sections targeting the maximum nominal axial strength. All cross-sections considered in the optimisation process must meet a prescribed critical global buckling load to be candidates for optimisation. The maximum dimensions of the cross-section are fixed and limited to a predefined rectangular area. The optimisation process is repeated for available coil thicknesses of 1 mm, 2.5 mm and 3 mm to determine the optimum thickness according to the cross-section's buckling behaviour. Simple-simple (pinned-pinned) end conditions are assumed. The number of folds is limited to 20 to prevent overly complicated sections. The global buckling load is taken as the Euler load, determined from the moment of inertia of the cross-section at a constant length. The critical buckling loads are obtained using the finite strip method. The results of the optimisation analysis are provided, and the optimum cross-section within the considered range is determined.
Keywords: shape optimisation, buckling, cold formed steel, finite strip method
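The Euler load used as the global-buckling screening criterion is a one-line formula. The modulus, moment of inertia, and length below are illustrative values, not the paper's section data.

```python
import math

# Euler global buckling load for simple-simple (pinned-pinned) ends;
# E, I, and L are assumed illustration values.

def euler_load(e_modulus, inertia, length):
    """Pcr = pi^2 * E * I / L^2."""
    return math.pi ** 2 * e_modulus * inertia / length ** 2

# steel section, assumed I = 2.0e-7 m^4, 3 m member length
p_cr = euler_load(200e9, 2.0e-7, 3.0)
```

Candidate sections whose Euler load falls below the prescribed threshold would be discarded before the finite strip analysis.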
Procedia PDF Downloads 402
3213 Close Loop Controlled Current Nerve Locator
Authors: H. A. Alzomor, B. K. Ouda, A. M. Eldeib
Abstract:
Successful regional anesthesia depends upon precise location of the peripheral nerve or nerve plexus, and locating peripheral nerves is preferably done using nerve stimulation. To generate a nerve impulse by electrical means, a minimum threshold stimulus current, the “rheobase”, must be applied to the nerve. The technique depends on stimulating muscular twitching at a close distance to the nerve without actually touching it, so the success rate of the procedure depends on the accuracy of the current-intensity pulses used for stimulation. In this paper, we discuss a circuit and algorithm for closed-loop control of the current, present theoretical analysis and test results, and compare them with previous techniques.
Keywords: Close Loop Control (CLC), constant current, nerve locator, rheobase
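The closed-loop idea can be sketched as a discrete PI controller regulating the stimulation current toward a rheobase-level setpoint. The plant model, gains, and setpoint are assumptions; the paper's actual circuit and algorithm are not disclosed in the abstract.

```python
# Discrete PI current-regulation sketch; the idealized unit-gain output
# stage and the controller gains are assumptions for illustration only.

def simulate(setpoint_ma, steps=400, kp=0.2, ki=0.05, plant_gain=1.0):
    current, integral = 0.0, 0.0
    for _ in range(steps):
        error = setpoint_ma - current     # deviation from target current
        integral += error                 # accumulated error (I term)
        drive = kp * error + ki * integral
        current = plant_gain * drive      # idealized output stage
    return current

i_out = simulate(1.0)   # regulate toward a 1 mA stimulus
```

The integral term drives the steady-state error to zero, which is what keeps the delivered pulse amplitude at the commanded value despite load variations.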
Procedia PDF Downloads 259
3212 The Staphylococcus aureus Exotoxin Recognition Using Nanobiosensor Designed by an Antibody-Attached Nanosilica Method
Authors: Hamed Ahari, Behrouz Akbari Adreghani, Vadood Razavilar, Amirali Anvar, Sima Moradi, Hourieh Shalchi
Abstract:
Given the ever-increasing population and the industrialization of humankind's developmental trend, the toxins produced in food products can no longer be detected using traditional techniques: the isolation time for food products is not cost-effective, and in most cases the precision of practical techniques such as bacterial cultivation suffers from operator errors or errors in the mixtures used. Hence, with the advent of nanotechnology, the design of selective, smart sensors is one of the great industrial advances in the quality control of food products: in a few minutes, and with very high precision, they can identify the amount and toxicity of bacteria. Methods and Materials: A sensor based on attaching a bacterial antibody to nanoparticles was used. As the absorption basis for recognition of the bacterial toxin, medium-sized (10 nm) silica nanoparticles in solid powder form (Notrino brand) were utilized. The suspension produced from the agent-linked nanosilica, conjugated to the bacterial antibody, was placed next to samples of distilled water contaminated with Staphylococcus aureus toxin at a density of 10⁻³, so that if any toxin existed in the sample, a connection between the toxin antigen and the antibody would form. Finally, the light absorption associated with the binding of the antigen to the particle-attached antibody was measured by spectrophotometry. The 23S rRNA gene, which is conserved in all Staphylococcus spp., was used as a control. The accuracy of the test was monitored using serial dilutions (10⁻⁶) of an overnight cell culture of Staphylococcus spp. bacteria (OD600: 0.02 = 10⁷ cells); this showed that the sensitivity of PCR is 10 bacteria per ml within a few hours. Results: The results indicate that the sensor detects down to a density of 10⁻⁴. Additionally, the sensitivity of the sensors was examined after 60 days: the sensor gave confirmatory results up to day 56, after which its response began to decrease. Conclusions: The advantages of the practical nanobiosensor over conventional methods, such as culture and molecular techniques (e.g., the polymerase chain reaction), are its accuracy, sensitivity and specificity; moreover, it reduces detection time from hours to about 30 minutes.
Keywords: exotoxin, nanobiosensor, recognition, Staphylococcus aureus
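The spectrophotometric readout rests on the Beer-Lambert relation, A = ε·l·c. The molar absorptivity and path length below are generic assumptions, not the calibration of the antibody-nanosilica system.

```python
# Beer-Lambert sketch behind the spectrophotometric measurement;
# epsilon and the cuvette path length are assumed generic values.

def absorbance(epsilon, path_cm, conc_mol):
    """A = epsilon * l * c (dimensionless absorbance)."""
    return epsilon * path_cm * conc_mol

def transmittance(a):
    """Fraction of light transmitted: T = 10^(-A)."""
    return 10 ** (-a)

A = absorbance(5.0e4, 1.0, 1.0e-5)   # assumed epsilon, 1 cm cell
T = transmittance(A)
```

A calibration curve of absorbance versus known toxin concentration is what would turn such readings into the reported detection limit.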
Procedia PDF Downloads 389
3211 Next-Generation Lunar and Martian Laser Retro-Reflectors
Authors: Simone Dell'Agnello
Abstract:
There are laser retroreflectors on the Moon but none on Mars. Here we describe the design, construction, qualification and imminent deployment of next-generation, optimized laser retroreflectors on the Moon and on Mars (where they will be the first). These instruments are positioned by time-of-flight measurements of short laser pulses, the so-called 'laser ranging' technique. Data analysis is carried out with PEP, the Planetary Ephemeris Program of the CfA (Center for Astrophysics). Since 1969, Lunar Laser Ranging (LLR) to the Apollo/Lunokhod corner cube retroreflector (CCR) arrays has supplied accurate tests of General Relativity (GR) and new gravitational physics: possible changes of the gravitational constant Gdot/G, the weak and strong equivalence principles, gravitational self-energy (the Parametrized Post-Newtonian parameter beta), geodetic precession and the inverse-square force law; it can also constrain gravitomagnetism. Some of these measurements have also allowed tests of extensions of GR, including spacetime torsion and non-minimally coupled gravity. LLR has also provided significant information on the composition of the deep interior of the Moon; in fact, LLR first provided evidence of the existence of a fluid component of the deep lunar interior. In 1969, CCR arrays contributed a negligible fraction of the LLR error budget. Since laser station range accuracy has improved by more than a factor of 100, the current arrays now dominate the error budget: because of lunar librations, their multi-CCR geometry spreads the return pulse. We developed MoonLIGHT (Moon Laser Instrumentation for General relativity High-accuracy Test), a next-generation single large CCR unaffected by librations that supports an improvement of the space segment of the LLR accuracy by up to a factor of 100. INFN also developed INRRI (INstrument for landing-Roving laser Retro-reflector Investigations), a microreflector to be laser-ranged by orbiters. Their performance is characterized at the SCF_Lab (Satellite/lunar laser ranging Characterization Facilities Lab, INFN-LNF, Frascati, Italy) for deployment on the lunar surface or in cislunar space. They will be used to accurately position landers, rovers, hoppers and orbiters of Google Lunar X Prize and space agency missions, thanks to LLR observations from stations of the International Laser Ranging Service in the USA, France and Italy. INRRI was launched in 2016 with the ESA mission ExoMars (Exobiology on Mars) EDM (Entry, descent and landing Demonstration Module), deployed on the Schiaparelli lander, and is proposed for the ExoMars 2020 Rover. Based on an agreement between NASA and ASI (Agenzia Spaziale Italiana), another microreflector, LaRRI (Laser Retro-Reflector for InSight), was delivered to JPL (Jet Propulsion Laboratory) and integrated on NASA's InSight Mars lander in August 2017 (launch scheduled for May 2018). A further microreflector, LaRA (Laser Retro-reflector Array), will be delivered to JPL for deployment on the NASA Mars 2020 Rover. The first lunar landing opportunities will be from early 2018 (with TeamIndus) to late 2018 with commercial missions, followed by opportunities with space agency missions, including the proposed deployment of MoonLIGHT and INRRI on NASA's Resource Prospector and its evolutions. In conclusion, we will significantly extend the CCR Lunar Geophysical Network and populate the Mars Geophysical Network; these networks will enable very significantly improved tests of GR.
Keywords: general relativity, laser retroreflectors, lunar laser ranging, Mars geodesy
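The time-of-flight principle behind laser ranging reduces to d = c·t/2 for a round-trip pulse. The sketch below ignores atmospheric and relativistic corrections, which real LLR analysis (e.g. with PEP) must model; the example round-trip time roughly matches the mean Earth-Moon distance.

```python
# Time-of-flight ranging sketch: one-way distance from a round-trip
# pulse time. Corrections (atmosphere, relativity) are omitted.

C = 299_792_458.0   # speed of light, m/s

def range_from_tof(round_trip_s):
    """One-way distance d = c * t / 2."""
    return C * round_trip_s / 2.0

d = range_from_tof(2.5636)   # ~2.56 s round trip to the Moon
```

The same arithmetic shows why timing matters: a 1 ns timing error corresponds to about 15 cm of range, which is why single, large CCRs that do not smear the return pulse improve the error budget.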
Procedia PDF Downloads 273
3210 Biological Soil Crust Effects on Dust Control Around the Urmia Lake
Authors: Abbas Ahmadi, Nasser Aliasgharzad, Ali Asghar Jafarzadeh
Abstract:
Nowadays, drying of the Urmia Lake as a largest saline lake in the world and emerging its saline bed from water has caused the risk of salty dune storms, which threats the health of human society and also plants and animal communities living in the region. Biological soil crusts (BSCs) as a dust stabilizer attracted the attention of Soil conservation experts in recent years. Although the presence of water by the impenetrable lake bed and endorheic basin can be an advantage to create BSCs, but the extraordinary of the lake bed salinity is a factor for prevention of its establishment in the region. Therefore, the present research work has been carried out to investigate the effects of inoculating the Cyanobacteria, algae and their combination to create BSCs for dust control. In this study, an algae attributed to Chlamydomonas sp and a cyanobacteria attributed to Anabaena sp isolated from the soils of Urmia Lake margin were used to create BSC in four soil samples which collected from 0-10 cm of the current margin (A), the previous bed (B), affected lands by lake (C) and Quomtappe sand dune (D). The main characteristics of the A, B and C soil samples are their highly salinity (their ECe are 108, 140 and 118 dS/m, respectively) and sodicity. Also, texture class of the soil A was loamy sand, and other two soils had clay textures. Soil D was Non-saline, but it was sodic with a sandy texture class. This study was conducted separately in each soil in a completely randomized design under four inoculation treatments of non-inoculated (T0), Algae (T1), cyanobacteria (T2) and equal mixture of algae and cyanobacteria (T3) with three replications. In the experiment, the soil was placed into wind tunnel trays, and a suspension containing microorganisms mixed with the trays surface soil. During the experiment, water was sprayed to the trays at the morning and evening of every day. 
After passing the incubation period (30 days), some characteristics of samples such as pH, EC, cold water extractable carbohydrate (CWEC), hot water extractable carbohydrate (HWEC), sulfuric acid extractable carbohydrate (SAEC), organic matter, crust thickness, penetration resistance, wind erosion threshold velocity and soil loss in the wind tunnel were measured, and Correlation between the measured characteristics was obtained through the SPSS software. Analysis of variance and so comparison between the means of treatments were analyzed with MSTATC software. In this research, Chlorophyll, an amount, was used as an indicator of the microorganism's population in the samples. Based on obtained results, the amount of Chlorophyll a in the T2 treatment of soil A and all treatments of soil D was significantly increased in comparison to the control and crust thickness showed increase in all treatments by microorganism’s inoculation. But effect of the treatments was significant in soils A and D. At all treatment’s inoculation of microorganisms in soil A caused to increase %46, %34 and %55 of the wind erosion threshold velocity in T1, T2 and T3 treatments in comparison to the control, respectively, and in soil D all treatments caused wind erosion threshold velocity became two times more than control. However, soil loss in the wind tunnel experiments was significant in T2 and T3 treatments of these soils and T1 treatment had no effect in reducing soil loss. Correlation between Chlorophyll a and salinity shows the important role of salinity in microbial growth prevention and formation of BSCs in the studied samples. In general, according to the obtained results, it can be concluded that salinity reduces the growth of microorganisms in saline soils of the region, and in soils with fine textures, salinity role in prevention of the microbial growth is clear. 
Also, using algae and cyanobacteria together produced synergistic growth and, consequently, better protection of the soil against wind erosion.Keywords: wind erosion, algae, cyanobacteria, carbohydrate
Procedia PDF Downloads 66
3209 High Resolution Satellite Imagery and Lidar Data for Object-Based Tree Species Classification in Quebec, Canada
Authors: Bilel Chalghaf, Mathieu Varin
Abstract:
Forest characterization in Quebec, Canada, is usually assessed through photo-interpretation at the stand level, which often results in a lack of precision for species identification. Very high spatial resolution imagery, such as DigitalGlobe, and Light Detection and Ranging (LiDAR) have the potential to overcome the limitations of aerial imagery. To date, few studies have used such data to map a large number of species at the tree level using machine learning techniques. The main objective of this study is to map 11 individual species of tall trees (> 17 m) at the tree level using an object-based approach in the broadleaf forest of Kenauk Nature, Quebec. For individual tree crown segmentation, three canopy-height models (CHMs) from LiDAR data were assessed: 1) the original, 2) a filtered, and 3) a corrected model. The corrected CHM gave the best accuracy and was then coupled with imagery to refine tree species crown identification. When compared with photo-interpretation, 90% of the objects represented a single species. For modeling, 313 variables were derived from 16-band WorldView-3 imagery and LiDAR data, using radiance, reflectance, pixel, and object-based calculation techniques. Variable selection procedures were employed to reduce their number from 313 to 16, using only 11 bands to aid reproducibility. For classification, a global approach using all 11 species was compared to a semi-hierarchical hybrid classification approach at two levels: (1) tree type (broadleaf/conifer) and (2) individual broadleaf (five) and conifer (six) species. Five model techniques were used: (1) support vector machine (SVM), (2) classification and regression tree (CART), (3) random forest (RF), (4) k-nearest neighbors (k-NN), and (5) linear discriminant analysis (LDA). Each model was tuned separately for all approaches and levels. For the global approach, the best model was the SVM using eight variables (overall accuracy (OA): 80%, Kappa: 0.77).
With the semi-hierarchical hybrid approach, at the tree type level, the best model was the k-NN using six variables (OA: 100% and Kappa: 1.00). At the level of identifying broadleaf and conifer species, the best model was the SVM, with OA of 80% and 97% and Kappa values of 0.74 and 0.97, respectively, using seven variables for both models. This paper demonstrates that a hybrid classification approach gives better results and that using 16-band WorldView-3 with LiDAR data leads to more precise predictions for tree segmentation and classification, especially when the number of tree species is large.Keywords: tree species, object-based, classification, multispectral, machine learning, WorldView-3, LiDAR
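The per-level tuning described above (a feature table per segmented crown, then a grid-searched classifier) can be sketched as follows. This is an illustrative sketch with synthetic features, not the authors' pipeline: the scikit-learn RBF-SVM and cross-validated grid search stand in for the tuning procedure, and the 8-variable, 11-species shapes are taken from the abstract.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, cohen_kappa_score

# Synthetic stand-in: 8 selected variables per segmented tree crown and
# labels for 11 species (the real features come from WorldView-3 + LiDAR).
rng = np.random.default_rng(0)
X = rng.normal(size=(330, 8))
y = rng.integers(0, 11, size=330)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

# Tune the RBF-SVM separately, as done for each approach and level.
svm = GridSearchCV(
    make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    param_grid={"svc__C": [1, 10, 100], "svc__gamma": ["scale", 0.1]},
    cv=3,
)
svm.fit(X_tr, y_tr)
pred = svm.predict(X_te)
print(accuracy_score(y_te, pred), cohen_kappa_score(y_te, pred))
```

Reporting both overall accuracy and Cohen's Kappa, as the paper does, guards against class-imbalance inflating the headline accuracy.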
Procedia PDF Downloads 1393208 Anisotropic Approach for Discontinuity Preserving in Optical Flow Estimation
Authors: Pushpendra Kumar, Sanjeev Kumar, R. Balasubramanian
Abstract:
Estimation of optical flow from a sequence of images using variational methods is one of the most successful approaches. Handling discontinuities between different motions is one of the challenging problems in flow estimation. In this paper, we design a new anisotropic diffusion operator, which is able to provide smooth flow over a region while efficiently preserving discontinuities in the optical flow. This operator is designed on the basis of the intensity differences of the pixels and an isotropic operator using an exponential function, the combination of which is used to control the propagation of flow. Experimental results on different datasets verify the robustness and accuracy of the algorithm and also validate the effect of the anisotropic operator in preserving discontinuities.Keywords: optical flow, variational methods, computer vision, anisotropic operator
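The abstract does not spell out the operator itself, but the general idea (an exponential of intensity differences gating the smoothing) can be sketched with a Perona-Malik-style edge-stopping weight. This is an illustrative sketch, not the authors' operator; `kappa` is an assumed contrast parameter.

```python
import numpy as np

def edge_stopping_weight(image, kappa=10.0):
    """Exponential diffusivity: ~1 in smooth regions (flow is smoothed
    freely), ~0 across large intensity differences, so flow
    discontinuities at motion boundaries are preserved."""
    gy, gx = np.gradient(image.astype(float))
    return np.exp(-(gx**2 + gy**2) / kappa**2)

# Flat region: weight stays at 1. Sharp edge: weight collapses toward 0.
flat = np.full((5, 5), 7.0)
step = np.hstack([np.zeros((5, 5)), 100 * np.ones((5, 5))])
print(edge_stopping_weight(flat).min())          # 1.0 everywhere
print(edge_stopping_weight(step)[:, 4:6].max())  # near 0 at the edge
```

In a variational scheme this weight multiplies the smoothness term, so the flow propagates inside objects but not across their boundaries.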
Procedia PDF Downloads 876
3207 The Two-Lane Rural Analysis and Comparison of Police Statistics and Results with the Help IHSDM
Authors: S. Amanpour, F. Mohamadian, S. A. Tabatabai
Abstract:
Given the number of accidents and fatalities in recent years, it can be concluded that the road accident situation in Iran remains critical. Investigating the causes of such incidents is a necessity in every country. Through this research, results on the number, type and location of crashes become available, making it possible to prioritize economical and rational solutions for fixing the flaws in the roadway: in the short term, the desired results are stricter rules at black spots and minor changes to the alignment, while in the long term the desired results are changing the system, increasing the width of the roadway or adding an extra lane. In general, the analysis was compared with nearby police statistics on the number of accidents in one year, which could confirm the accuracy of the analysis performed.Keywords: traffic, IHSDM, crash, modeling, Khuzestan
Procedia PDF Downloads 287
3206 Fractional Order Differentiator Using Chebyshev Polynomials
Authors: Koushlendra Kumar Singh, Manish Kumar Bajpai, Rajesh Kumar Pandey
Abstract:
A discrete-time fractional order differentiator has been modeled for estimating the fractional order derivatives of a contaminated signal. The proposed approach is based on Chebyshev polynomials. We use the Riemann-Liouville fractional order derivative definition for designing the fractional order SG differentiator. In the first step, we calculate the window weights corresponding to the required fractional order; the signal is then convoluted with these calculated window weights to find the fractional order derivatives of the signal. Several signals are considered for evaluating the accuracy of the proposed method.Keywords: fractional order derivative, Chebyshev polynomials, signals, S-G differentiator
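The Chebyshev-based window weights themselves are not given in the abstract. As an illustration of the same two-step pattern (precompute window weights for the required order, then convolve), the sketch below substitutes the simpler Grünwald-Letnikov coefficients for the authors' Chebyshev/Riemann-Liouville weights.

```python
import numpy as np

def gl_weights(alpha, n):
    """Grünwald-Letnikov coefficients w_k = (-1)^k * binom(alpha, k),
    built by the stable recurrence w_k = w_{k-1} * (k - 1 - alpha) / k."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - alpha) / k
    return w

def fractional_derivative(signal, alpha, h=1.0):
    """Estimate D^alpha of a sampled signal by causal convolution with
    the precomputed window weights (step two of the scheme)."""
    w = gl_weights(alpha, len(signal))
    return np.convolve(signal, w)[: len(signal)] / h**alpha

t = np.arange(6, dtype=float)
print(fractional_derivative(t, alpha=1.0))  # reduces to the first difference
```

For alpha = 1 the weights collapse to [1, -1, 0, ...] and the output is the ordinary backward difference, a useful sanity check for any fractional differentiator.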
Procedia PDF Downloads 651
3205 Two-Stage Estimation of Tropical Cyclone Intensity Based on Fusion of Coarse and Fine-Grained Features from Satellite Microwave Data
Authors: Huinan Zhang, Wenjie Jiang
Abstract:
Accurate estimation of tropical cyclone intensity is of great importance for disaster prevention and mitigation. Existing techniques are largely based on satellite imagery data, and the research and utilization of the inner thermal core structure of tropical cyclones still pose challenges. This paper presents a two-stage tropical cyclone intensity estimation network based on the fusion of coarse and fine-grained features from microwave brightness temperature data. The data used in this network are obtained from the thermal core structure of tropical cyclones through Advanced Technology Microwave Sounder (ATMS) inversion. Firstly, the thermal core information in the pressure direction is comprehensively expressed through the maximal intensity projection (MIP) method, constructing coarse-grained thermal core images that represent the tropical cyclone. These images provide a first-stage wind speed estimate based on coarse-grained features. Then, based on this result, fine-grained features are extracted by combining thermal core information from multiple view profiles with a distributed network and fused with the coarse-grained features from the first stage to obtain the final two-stage wind speed estimate. Furthermore, to better capture the long-tail distribution characteristics of tropical cyclones, focal loss is used in the coarse-grained loss function of the first stage, and ordinal regression loss is adopted in the second stage to replace traditional single-value regression. The selected tropical cyclones span from 2012 to 2021 in the North Atlantic (NA) region. The training set covers 2012 to 2017, the validation set 2018 to 2019, and the test set 2020 to 2021.
Based on the Saffir-Simpson Hurricane Wind Scale (SSHS), this paper categorizes tropical cyclones into three major categories: pre-hurricane, minor hurricane, and major hurricane, achieving a classification accuracy of 86.18% and an intensity estimation error of 4.01 m/s for the NA region. The results indicate that thermal core data can effectively represent the level and intensity of tropical cyclones, warranting further exploration of tropical cyclone attributes based on these data.Keywords: artificial intelligence, deep learning, data mining, remote sensing
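The first-stage loss is named as focal loss; the standard binary formulation with focusing parameter gamma is sketched below as an assumption (the paper's exact variant and hyperparameters are not given in the abstract). Its point here is that easy, common examples are down-weighted, so the long tail of rare, intense cyclones contributes more to training.

```python
import numpy as np

def focal_loss(p, y, gamma=2.0):
    """Binary focal loss: the (1 - p_t)^gamma factor shrinks the
    contribution of well-classified examples toward zero."""
    p = np.clip(p, 1e-7, 1 - 1e-7)
    p_t = np.where(y == 1, p, 1 - p)
    return float(np.mean(-((1 - p_t) ** gamma) * np.log(p_t)))

# An easy, well-classified example contributes far less than a hard one.
easy = focal_loss(np.array([0.95]), np.array([1]))
hard = focal_loss(np.array([0.30]), np.array([1]))
print(easy, hard)
```

With gamma = 0 the expression reduces to plain cross-entropy, which is the usual way to verify an implementation.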
Procedia PDF Downloads 68
3204 In-situ Fabrication of a Metal-Intermetallic Composite: Microstructure Evolution and Mechanical Response
Authors: Monireh Azimi, Mohammad Reza Toroghinejad, Leo A. I. Kestens
Abstract:
The role of different metallic and intermetallic reinforcements on the microstructure and the associated mechanical response of a composite is of crucial importance. To investigate this issue, a multiphase metal-intermetallic composite was fabricated in-situ through reactive annealing and accumulative roll bonding (ARB) processes. EBSD results indicated that the lamellar grain structure of the Al matrix after the first cycle evolved with increasing strain into a mixed structure consisting of equiaxed and lamellar grains, whereby a steady state had not been reached even after the 3rd (last) cycle. After applying a strain of 6.1 in the Al phase, the length and thickness of the grains were reduced by 92.2% and 97.3%, respectively, compared to the annealed state. Intermetallic phases, together with the metallic Ni reinforcement, influence grain fragmentation of the Al matrix and give rise to a specific texture evolution by creating heterogeneity in the strain and flow patterns. The multiphase composite demonstrated yield and ultimate tensile strengths of 217.9 MPa and 340.1 MPa, respectively, compared to 48.7 MPa and 55.4 MPa in the metal-intermetallic laminated (MIL) sandwich before the ARB process, corresponding to increases of 347% in yield strength and 514% in tensile strength.Keywords: accumulative roll bonding, mechanical properties, metal-intermetallic composite, severe plastic deformation, texture
Procedia PDF Downloads 197
3203 Surface Elevation Dynamics Assessment Using Digital Elevation Models, Light Detection and Ranging, GPS and Geospatial Information Science Analysis: Ecosystem Modelling Approach
Authors: Ali K. M. Al-Nasrawi, Uday A. Al-Hamdany, Sarah M. Hamylton, Brian G. Jones, Yasir M. Alyazichi
Abstract:
Surface elevation dynamics have always responded to disturbance regimes. Creating Digital Elevation Models (DEMs) to detect surface dynamics has led to the development of several methods, devices and data clouds. DEMs can provide accurate and quick results cost-efficiently, in comparison to traditional geomatics survey techniques. Nowadays, remote sensing datasets have become a primary source for creating DEMs, including LiDAR point clouds with GIS analytic tools. However, these data need to be tested for error detection and correction. This paper evaluates various DEMs from different data sources over time for Apple Orchard Island, a coastal site in southeastern Australia, in order to detect surface dynamics. Subsequently, 30 chosen locations were examined in the field to test the error of the DEM surface detection using high-resolution global positioning systems (GPSs). Results show significant surface elevation changes on Apple Orchard Island. Accretion occurred on most of the island, while surface elevation loss due to erosion is limited to the northern and southern parts. Concurrently, a differential correction and validation method was applied to identify errors in the dataset. The resultant DEMs demonstrated a small error ratio (≤ 3%) against the fieldwork survey using RTK-GPS. As modern modelling approaches need to become more effective and accurate, applying several tools to create different DEMs on a multi-temporal scale allows easy predictions within time-cost frames, with more comprehensive coverage and greater accuracy. With a DEM technique for the eco-geomorphic context, such insights about ecosystem dynamics detection at such a coastal intertidal system would be valuable for assessing the accuracy of the predicted eco-geomorphic risk for conservation management sustainability.
This framework for evaluating the historical and current anthropogenic and environmental stressors on coastal surface elevation dynamics could be profitably applied worldwide.Keywords: DEMs, eco-geomorphic-dynamic processes, geospatial information science, remote sensing, surface elevation changes
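The change-detection step described above reduces to differencing co-registered DEMs and masking cells whose apparent change lies within the vertical uncertainty. A minimal sketch (the detection threshold is an assumed illustrative parameter, not a value from the study):

```python
import numpy as np

def dem_of_difference(dem_t0, dem_t1, min_detect=0.05):
    """Per-cell elevation change between two co-registered DEMs (metres).
    Positive = accretion, negative = erosion; cells whose |change| falls
    below the minimum detectable change are masked to 0."""
    dod = dem_t1 - dem_t0
    return np.where(np.abs(dod) >= min_detect, dod, 0.0)

before = np.array([[1.00, 1.00], [1.00, 1.00]])
after  = np.array([[1.20, 0.90], [1.02, 1.00]])
print(dem_of_difference(before, after))
```

In practice the threshold would be propagated from the per-DEM vertical errors (e.g. from the RTK-GPS validation), so that only changes exceeding the combined uncertainty are reported as real accretion or erosion.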
Procedia PDF Downloads 269
3202 Design and Analysis of Enhanced Heat Transfer Kit for Plate Type Heat Exchanger
Authors: Muhammad Shahrukh Saeed, Syed Ahmad Nameer, Shafiq Ur Rehman, Aisha Jillani
Abstract:
Heat exchangers play a critical role in industrial applications of thermal systems. Their physical size and performance are vital parameters; therefore, enhancement of heat transfer through different techniques has remained a major research area for both academia and industry. This research addresses a better kit design for a plate type heat exchanger, which plays a vital role during the process of heat transfer. A plate type heat exchanger mainly requires a design in which the plates can easily be installed and removed without any problem, and the fluid flow within the heat exchanger should be fully developed. Natural laws allow the driving energy of the system to flow until equilibrium is achieved; in a plate type heat exchanger, heat readily penetrates the surface that separates the hot medium from the cold one. Some precautions should be considered when evaluating the heat exchanger: heat should transfer from the hot medium to the cold one, a temperature difference should always be present, and the heat lost by the hot body should equal the heat gained by the cold body, apart from losses to the surroundings. Aluminum plates of the same grade were used in all experiments to ensure similarity. The size of all plates was 254 mm × 100 mm, and the thickness was taken as 5 mm.Keywords: heat transfer coefficient, aluminium, entry length, design
Procedia PDF Downloads 336
3201 Optimisation of the Input Layer Structure for Feedforward Narx Neural Networks
Authors: Zongyan Li, Matt Best
Abstract:
This paper presents an optimization method for reducing the number of input channels and the complexity of a feed-forward NARX neural network (NN) without compromising the accuracy of the NN model. Using correlation analysis, the most significant regressors are selected to form the input layer of the NN structure. An application to vehicle dynamic model identification is also presented to demonstrate the optimization technique, and the optimal input layer structure and optimal number of neurons for the neural network are investigated.Keywords: correlation analysis, F-ratio, Levenberg-Marquardt, MSE, NARX, neural network, optimisation
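The correlation-analysis step can be sketched as ranking candidate regressors by the magnitude of their Pearson correlation with the output and keeping only the significant ones to form the input layer. This is an illustrative sketch: the cut-off value and the channel names are assumptions, not taken from the paper.

```python
import numpy as np

def select_regressors(candidates, target, threshold=0.3):
    """Keep candidate input channels whose |Pearson correlation| with
    the target exceeds the threshold, ranked most significant first."""
    scored = []
    for name, x in candidates.items():
        r = np.corrcoef(x, target)[0, 1]
        if abs(r) >= threshold:
            scored.append((name, r))
    return sorted(scored, key=lambda item: -abs(item[1]))

t = np.arange(10, dtype=float)
target = 2.0 * t + 1.0
channels = {
    "steer_angle": t,                     # strongly correlated regressor
    "noise": np.resize([1.0, -1.0], 10),  # ~zero correlation, dropped
}
print(select_regressors(channels, target))  # keeps only "steer_angle"
```

Dropping weakly correlated channels shrinks the input layer, which in turn reduces the number of weights the Levenberg-Marquardt training has to fit.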
Procedia PDF Downloads 375
3200 Relativity in Toddlers' Understanding of the Physical World as Key to Misconceptions in the Science Classroom
Authors: Michael Hast
Abstract:
Within their first year, infants can differentiate between objects based on their weight. By at least 5 years of age, children hold consistent weight-related misconceptions about the physical world, such as that heavy things fall faster than lighter ones because of their weight. Such misconceptions are seen as a challenge for science education since they are often highly resistant to change through instruction. Understanding the time point of emergence of such ideas could, therefore, be crucial for early science pedagogy. The paper thus discusses two studies that jointly address the issue by examining young children’s search behaviour in hidden displacement tasks under consideration of relative object weight. In both studies, children were tested with a heavy or a light ball, and they had information either about one of the balls only or about both. In Study 1, 88 toddlers aged 2 to 3½ years watched a ball being dropped into a curved tube and were then allowed to search for the ball in three locations: one straight beneath the tube entrance, one where the curved tube led to, and one that corresponded to neither of the previous outcomes. Success and failure at the task were not impacted in any particular way by the weight of the balls alone. However, from around 3 years onwards, relative lightness, gained through tactile experience of both balls beforehand, enhanced search success. Conversely, relative heaviness increased search errors such that children increasingly searched in the location immediately beneath the tube entry, known as the gravity bias. In Study 2, 60 toddlers aged 2, 2½ and 3 years watched a ball roll down a ramp and behind a screen with four doors, with a barrier placed along the ramp after one of the four doors. Toddlers were allowed to open the doors to find the ball. While search accuracy generally increased with age, relative weight did not play a role in 2-year-olds’ search behaviour. Relative lightness improved 2½-year-olds’ searches.
At 3 years, both relative lightness and relative heaviness had a significant impact, with the former improving search accuracy and the latter reducing it. Taken together, both studies suggest that between 2 and 3 years of age, relative object weight is increasingly taken into consideration in navigating naïve physical concepts. In particular, it appears to contribute to the early emergence of misconceptions relating to object weight. This insight from developmental psychology research may have consequences for early science education and related pedagogy towards early conceptual change.Keywords: conceptual development, early science education, intuitive physics, misconceptions, object weight
Procedia PDF Downloads 191
3199 Synthesis and Electromagnetic Wave Absorbing Property of Amorphous Carbon Nanotube Networks on a 3D Graphene Aerogel/BaFe₁₂O₁₉ Nanorod Composite
Authors: Tingkai Zhao, Jingtian Hu, Xiarong Peng, Wenbo Yang, Tiehu Li
Abstract:
Homogeneous amorphous carbon nanotube (ACNT) networks have been synthesized by the floating catalyst chemical vapor deposition method on a three-dimensional (3D) graphene aerogel (GA)/BaFe₁₂O₁₉ nanorod (BNR) composite, which was prepared by a self-propagating combustion process. The as-synthesized ACNT/GA/BNR composite, which has a 3D network structure, could be directly used as a good absorber in electromagnetic wave absorbent materials. The experimental results indicated that the maximum absorbing peak of the ACNT/GA/BNR composite with a thickness of 2 mm was -18.35 dB at 10.64 GHz in the frequency range of 2-18 GHz, and the bandwidth with reflectivity below -10 dB is 3.32 GHz. The 3D graphene aerogel structures, composed of dense interlinked tubes, and the amorphous structure of the ACNTs, bearing numerous dihedral angles, can consume the incident waves through multiple reflection and scattering inside the 3D web structures. The interlinked ACNTs have the virtues of both amorphous CNTs (multiple reflections inside the wall) and crystalline CNTs (high conductivity), consuming the electromagnetic wave as resistance heat. The ACNT/GA/BNR composite has a good electromagnetic wave absorbing performance.Keywords: amorphous carbon nanotubes, graphene aerogel, barium ferrite nanorod, electromagnetic wave absorption
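The quoted figures translate directly into absorbed power: reflection loss in dB gives the fraction of incident power reflected, so (assuming transmission through a backed absorber is negligible) -10 dB corresponds to roughly 90% absorption and the -18.35 dB peak to about 98.5%. A quick check:

```python
def reflected_fraction(rl_db):
    """Fraction of incident power reflected at a given reflection loss,
    10^(RL/10); the remainder is absorbed for a backed absorber."""
    return 10 ** (rl_db / 10)

for rl in (-10.0, -18.35):
    absorbed = 1.0 - reflected_fraction(rl)
    print(f"{rl:>7} dB -> {absorbed:.1%} absorbed")
```

This is why the "-10 dB bandwidth" is the standard figure of merit: it is the frequency range over which at least 90% of the incident power is absorbed.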
Procedia PDF Downloads 287
3198 Performance Improvement of SBR Polymer Concrete Used in Construction of Rigid Pavement Highway
Authors: Mohammed Abbas Al-Jumaili
Abstract:
Some studies have been conducted in recent years to investigate the possibility of producing high performance polymer concrete. However, despite the great importance of this subject, a very limited amount of literature is available about the strength and performance of this type of concrete when used in rigid pavement highways. In this study, the possibility of producing high performance polymer concrete by using styrene-butadiene rubber (SBR) emulsion at SBR contents of 5, 10, 15, and 20% by weight of cement has been investigated. The compressive, splitting tensile and flexural strength tests and dynamic modulus of elasticity tests were conducted at ages of 7 and 28 days for the control (without polymer) and SBR concretes. A total of 30 cubes, 30 cylinders and 30 prisms were prepared using the different concrete mixes. The AASHTO guide (1993) method was used to determine the concrete slab thickness of the rigid pavement highway for the various SBR polymer concrete mixtures. The research results indicate that the use of 10% SBR by weight of cement leads to a high performance concrete, especially with regard to mechanical properties, relative to the corresponding control concrete.Keywords: rigid pavement highway, styrene-butadiene rubber (SBR) latex, compressive test, splitting tensile test, flexural test and dynamic modulus of elasticity test
Procedia PDF Downloads 329
3197 Calculation of Lungs Physiological Lung Motion in External Lung Irradiation
Authors: Yousif Mohamed Y. Abdallah, Khalid H. Eltom
Abstract:
This experimental study deals with measurement of the periodic physiological organ motion during external lung irradiation in order to reduce the exposure of healthy tissue during radiation treatments. The results showed a displacement reading of 4.52 ± 1.99 mm for the left lung and 8.21 ± 3.77 mm for the right lung, for which the radiotherapy physician should take suitable countermeasures in case of significant errors. The motion ranged between 2.13 mm and 12.2 mm (low and high). In conclusion, the calculation of tumour mobility can improve the accuracy of target area definition in patients undergoing stereotactic RT for stage I, II and III lung cancer (NSCLC). Definition of the target volume based on a high resolution CT scan with a margin of 3-5 mm is appropriate.Keywords: physiological motion, lung, external irradiation, radiation medicine
Procedia PDF Downloads 424
3196 On-Road Text Detection Platform for Driver Assistance Systems
Authors: Guezouli Larbi, Belkacem Soundes
Abstract:
The automation of the text detection process can assist drivers in their driving task. It can be very useful in helping drivers obtain more information about their environment by facilitating the reading of road signage such as directional signs, events, stores, etc. In this paper, a system consisting of two stages has been proposed. In the first stage, we use pseudo-Zernike moments to pinpoint areas of the image that may contain text. The architecture of this part is based on three main steps: region of interest (ROI) detection, text localization, and non-text region filtering. In the second stage, we present a convolutional neural network architecture (On-Road Text Detection Network, ORTDN) which constitutes the classification phase. The results show that the proposed framework achieved ≈ 35 fps and an mAP of ≈ 90%, thus a low computational time with competitive accuracy.Keywords: text detection, CNN, PZM, deep learning
Procedia PDF Downloads 87
3195 Degradation of Mechanical Properties of Offshoring Polymer Composite Pipes in Thermal Environment
Authors: Hamza Benyahia, Mostapha Tarfaoui, Ahmed El-Moumen, Djamel Ouinas
Abstract:
Composite pipes are commonly used in the oil industry, and extreme flows of hot and cold gas and fluid can cause degradation of their mechanical performance and properties. Therefore, it is necessary to consider thermomechanical behavior as an important parameter in designing these tubular structures. In this paper, an experimental study is conducted on composite glass/epoxy tubes, with a thickness of 6.2 mm and an internal diameter of 86 mm, made by filament winding (Φ = ± 55°), to investigate the effects of extreme thermal conditions on their mechanical properties over a temperature range from -40 to 80 °C. A climatic chamber is used for the thermal aging, and a combined split disk system is then used to perform tensile tests on these composite pipes. Thermal aging is carried out for 8 h, with each specimen subjected to a different temperature range, and then a uniaxial tensile test is conducted to evaluate their mechanical performance. Experimental results show degradation in the mechanical properties of the composite pipes with an increase in temperature. The rigidity of the pipes increases progressively with a decrease in thermal load and results in a radical decrease in their elongation before fracture, thus decreasing their ductility. However, with an increase in temperature, there is a decrease in the yield strength and an increase in yield strain, which confirms an increase in the plasticity of the composite pipes.Keywords: composite pipes, thermal-mechanical properties, filament winding, thermal degradation
Procedia PDF Downloads 150