Search results for: computational accuracy
Oral Grammatical Errors of Arabic as Second Language (ASL) Learners: An Applied Linguistic Approach
Authors: Sadeq Al Yaari, Fayza Al Hammadi, Ayman Al Yaari, Adham Al Yaari, Montaha Al Yaari, Aayah Al Yaari, Sajedah Al Yaari, Salah Al Yami
Abstract:
Background: When Arabic grammatical issues are considered in light of applied linguistic investigations of Arabic as Second Language (ASL) learners, a fundamental issue arises concerning the production of speech in Arabic: the oral grammatical errors committed by ASL learners. Aims: Using manual rating as well as a computational analytic methodology to test a corpus of recorded speech by ASL learners, this study aims to identify the areas of difficulty in learning Arabic grammar. More specifically, it examines how and why ASL learners make grammatical errors in their oral speech. Methods: Tape recordings of four (4) ASL learners, ranging in age from 23 to 30, were collected in natural settings. All participants had completed an intensive two-year Arabic program, and a 20-minute speech sample was recorded for each participant. The collected corpus was then rated against standard Arabic grammar, a process comprising description, analysis, and assessment. Conclusions: The findings can be summarized as follows: ASL learners face many grammatical difficulties when studying Arabic word order, tenses and aspects, function words, subject-verb agreement, verb form, active-passive voice, global and local errors, and process-based errors including addition, omission, substitution, or a combination of any of them.
Keywords: grammar, error, oral, Arabic, second language, learner, applied linguistics
Procedia PDF Downloads 45
Iris Recognition Based on the Low Order Norms of Gradient Components
Authors: Iman A. Saad, Loay E. George
Abstract:
The iris pattern is an important biometric feature of the human body and has become a very active topic in both research and practical applications. In this paper, an algorithm is proposed for iris recognition, and a simple, efficient, and fast method is introduced to extract a set of discriminatory features using a first-order gradient operator applied to grayscale images. The gradient-based features are robust, to a certain extent, against variations in the contrast or brightness of iris image samples; such variations mostly occur due to lighting differences and camera changes. First, the iris region is located and then remapped to a rectangular area of 360x60 pixels. A new method is also proposed for detecting eyelash and eyelid points; it relies on statistical analysis of the image to mark eyelash and eyelid pixels as noise points. To account for feature localization (variation), the rectangular iris image is partitioned into N overlapped sub-images (blocks); from each block, a set of average directional gradient density values is calculated and used as a texture feature vector. The gradient operators are applied along the horizontal, vertical, and diagonal directions, and the low-order norms of the gradient components are used to establish the feature vector. A Euclidean-distance-based classifier serves as the matching metric, determining the degree of similarity between the feature vector extracted from the tested iris image and the template feature vectors stored in the database. Experimental tests were performed using 2639 iris images from the CASIA V4-Interval database, and the attained recognition accuracy reached 99.92%.
Keywords: iris recognition, contrast stretching, gradient features, texture features, Euclidean metric
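As a hedged sketch of the feature extraction and matching stages described above, the snippet below computes average directional gradient densities (horizontal, vertical, and diagonal first-order differences) for a toy grayscale block and compares two such feature vectors with a Euclidean distance. The block data and function names are illustrative, not the paper's implementation.

```python
import math

def block_gradient_features(block):
    """Average absolute first-order gradient densities of a grayscale
    block along the horizontal, vertical and diagonal directions.
    `block` is a list of rows of pixel intensities (toy input)."""
    h, w = len(block), len(block[0])
    gh = gv = gd = 0.0
    n = 0
    for y in range(h - 1):
        for x in range(w - 1):
            gh += abs(block[y][x + 1] - block[y][x])      # horizontal
            gv += abs(block[y + 1][x] - block[y][x])      # vertical
            gd += abs(block[y + 1][x + 1] - block[y][x])  # diagonal
            n += 1
    return [gh / n, gv / n, gd / n]  # low-order (L1) norm averages

def euclidean(a, b):
    """Euclidean distance used as the matching metric."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Toy 4x4 "iris blocks": a template and a probe.
template = [[10, 12, 11, 13], [11, 12, 12, 14],
            [10, 11, 13, 15], [12, 13, 14, 16]]
probe    = [[11, 13, 12, 14], [12, 13, 13, 15],
            [11, 12, 14, 16], [13, 14, 15, 17]]

d = euclidean(block_gradient_features(template),
              block_gradient_features(probe))
print(d)  # 0.0
```

Because the probe here is the template shifted by a constant brightness offset, its first-order gradients are identical and the distance is zero, illustrating the claimed robustness of gradient features to brightness variation.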
Procedia PDF Downloads 335
Evaluation of Current Methods in Modelling and Analysis of Track with Jointed Rails
Authors: Hossein Askarinejad, Manicka Dhanasekar
Abstract:
In railway tracks, two adjacent rails are either welded or connected using bolted jointbars. In recent years, the number of bolted rail joints has been reduced by the introduction of longer rail sections and by welding the rails at the location of some joints. However, a significant number of bolted rail joints remain in railways around the world, as they are required to allow for rail thermal expansion or to provide electrical insulation in some sections of track. Regardless of the quality and integrity of the jointbar and bolt connections, the bending stiffness of jointbars is much lower than that of the rail, generating large deflections under the train wheels. In addition, the gap or surface discontinuity on the rail running surface leads to high wheel-rail impact forces at the joint gap. These fundamental weaknesses have caused a high rate of failure in track components at rail joints, resulting in significant economic and safety issues for railways. The mechanical behavior of railway track at joints is not fully understood due to various structural and material complexities. Although the methods for analyzing track with jointed rails have improved in recent years, uncertainties remain concerning their accuracy and reliability. In this paper, the current methods for analyzing track with a rail joint are critically evaluated, and new advances and recent research outcomes in this area are discussed. This research is part of a large granted project on rail joints defined by the Cooperative Research Centre (CRC) for Rail Innovation with support from the Australian Rail Track Corporation (ARTC) and Queensland Rail (QR).
Keywords: jointed rails, railway mechanics, track dynamics, wheel-rail interaction
Procedia PDF Downloads 350
Recent Progress in Wave Rotor Combustion
Authors: Mohamed Razi Nalim, Shahrzad Ghadiri
Abstract:
With current concerns regarding global warming, demand for greater environmental awareness in society has increased significantly. Alongside the gradual development of hybrid and electric vehicles and the availability of renewable energy resources, increasing the efficiency of fossil-fuel combustion engines appears to be a faster route toward sustainability and reduced greenhouse gas emissions. This paper provides a comprehensive review of recent progress in wave rotor combustors, one of the combustion concepts with considerable potential to improve power output and emission standards. A wave rotor is an oscillatory flow device that uses unsteady gas dynamics to transfer energy by generating pressure waves. From a thermodynamic point of view, unlike conventional gas turbine combustors, which follow the constant-pressure Brayton cycle, wave rotors offer higher cycle efficiency due to the pressure gain achieved during combustion, following the Humphrey cycle. First, the paper covers recent and ongoing computational and experimental studies around the world, with a quick look at the milestones in the history of wave rotor development. Second, the main similarities and differences between the ignition systems of wave rotors and piston engines are considered, and a comparison is made with another pressure-gain device, the rotating detonation engine. Finally, the main challenges and research needs for wave rotor combustor commercialization are discussed.
Keywords: wave rotor combustor, unsteady gas dynamics, pre-chamber jet ignition, pressure gain combustion, constant-volume combustion
Procedia PDF Downloads 84
Design of Digital IIR Filter Using Opposition Learning and Artificial Bee Colony Algorithm
Authors: J. S. Dhillon, K. K. Dhaliwal
Abstract:
In almost all digital filtering applications, digital infinite impulse response (IIR) filters are preferred over finite impulse response (FIR) filters because they provide much better performance, lower computational cost, and smaller memory requirements for similar magnitude specifications. However, digital IIR filters are generally multimodal with respect to the filter coefficients, and therefore reliable methods that can provide globally optimal solutions are required. The artificial bee colony (ABC) algorithm is one such recently introduced meta-heuristic optimization algorithm. In some cases, however, it searches the solution space insufficiently, resulting in a weak exchange of information, and fails to return better solutions. To overcome this deficiency, an opposition-based learning strategy is incorporated into ABC, and a modified version called the oppositional artificial bee colony (OABC) algorithm is proposed in this paper. Duplication of members is avoided during the run, which also augments the exploration ability. The developed algorithm is then applied to the design of optimal and stable digital IIR filter structures, where low-pass (LP) and high-pass (HP) filters are designed. Fuzzy theory is applied to maximize the joint satisfaction of the minimum-magnitude-error objective and the stability constraints. To check the effectiveness of OABC, the results are compared with some well-established filter design techniques; in most cases OABC returns better, or at least comparable, results.
Keywords: digital infinite impulse response filter, artificial bee colony optimization, opposition based learning, digital filter design, multi-parameter optimization
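The opposition-based learning idea that distinguishes OABC from plain ABC can be sketched as follows. The bounds, the shifted-sphere objective, and all names are illustrative stand-ins for the paper's filter-design objective, not its actual formulation.

```python
import random

def shifted_sphere(x):
    """Toy minimization objective, a hypothetical stand-in for the
    magnitude-error objective of the IIR filter design problem."""
    return sum((xi - 0.3) ** 2 for xi in x)

def opposition_init(pop_size, dim, lb, ub, fitness):
    """Opposition-based initialization: for every random food source x,
    also evaluate its opposite point lb + ub - x and keep the fitter
    of the pair. This doubles the chance of starting near an optimum
    for the same number of retained candidates."""
    population = []
    for _ in range(pop_size):
        x = [random.uniform(lb, ub) for _ in range(dim)]
        x_opp = [lb + ub - xi for xi in x]  # opposite candidate
        population.append(min(x, x_opp, key=fitness))
    return population

random.seed(1)
pop = opposition_init(pop_size=10, dim=4, lb=-1.0, ub=1.0,
                      fitness=shifted_sphere)
# With lb = -1 and ub = 1, the opposite of x is simply -x; every kept
# member is at least as fit as its discarded opposite.
print(all(shifted_sphere(p) <= shifted_sphere([-pi for pi in p])
          for p in pop))  # True
```

In the full OABC loop the same opposition step is typically reapplied during the search with some probability; the sketch shows only the initialization, which is where the idea is easiest to see.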
Procedia PDF Downloads 477
Accelerating Quantum Chemistry Calculations: Machine Learning for Efficient Evaluation of Electron-Repulsion Integrals
Authors: Nishant Rodrigues, Nicole Spanedda, Chilukuri K. Mohan, Arindam Chakraborty
Abstract:
A crucial objective in quantum chemistry is the computation of the energy levels of chemical systems. This task requires electron-repulsion integrals as inputs, and the steep computational cost of evaluating these integrals poses a major numerical challenge to the efficient implementation of quantum chemistry software. This work presents a moment-based machine-learning approach for the efficient evaluation of electron-repulsion integrals. The integrals were approximated as linear combinations of a small number of moments, and machine learning algorithms were applied to estimate the coefficients in the linear combination. A random forest with recursive feature elimination was used to identify promising features; it performed best for learning the sign of each coefficient but not the magnitude. A neural network with two hidden layers was then used to learn the coefficient magnitudes, along with an iterative feature-masking approach that compresses the input vector by identifying a small subset of orbitals whose coefficients suffice for the quantum state energy computation. Finally, a small ensemble of neural networks (with a median rule for decision fusion) was shown to improve results compared to a single network.
Keywords: quantum energy calculations, atomic orbitals, electron-repulsion integrals, ensemble machine learning, random forests, neural networks, feature extraction
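The split estimation described above — one model for the sign of each coefficient, an ensemble with median fusion for its magnitude — can be sketched in a few lines. All models here are hypothetical stand-ins shown as plain callables; the real work uses a trained random forest and neural networks.

```python
import statistics

def predict_coefficient(features, sign_model, magnitude_models):
    """Combine a sign classifier with a median-fused ensemble of
    magnitude regressors to estimate one moment coefficient."""
    sign = sign_model(features)                     # returns +1 or -1
    magnitudes = [m(features) for m in magnitude_models]
    return sign * statistics.median(magnitudes)     # median decision fusion

# Toy stand-in models (illustrative only).
def sign_model(f):
    return 1 if sum(f) >= 0 else -1

ensemble = [lambda f: 0.9 * abs(f[0]),
            lambda f: 1.0 * abs(f[0]),
            lambda f: 1.3 * abs(f[0])]

coeff = predict_coefficient([-2.0, 0.5], sign_model, ensemble)
print(coeff)  # sign -1, median magnitude 2.0 -> -2.0
```

The median rule makes the fused magnitude insensitive to a single badly trained ensemble member, which is the motivation the abstract gives for preferring the small ensemble over a single network.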
Procedia PDF Downloads 113
Field Evaluation of Concrete Using Hawaiian Aggregates for Alkali Silica Reaction
Authors: Ian N. Robertson
Abstract:
Alkali silica reaction (ASR) occurs in concrete when the alkali hydroxides (Na, K, and OH) from the cement react with unstable silica, SiO2, in some types of aggregate. The gel that forms during this reaction expands when it absorbs water, potentially leading to cracking and overall expansion of the concrete. ASR has caused accelerated deterioration of concrete highways, dams, and other structures exposed to moisture during their service life. Concrete aggregates available in Hawaii have not demonstrated a history of ASR; however, accelerated laboratory tests using ASTM C1260 indicated a potential for ASR with some aggregates, and certain clients are now requiring the import of aggregates from the US mainland at great expense. In order to assess the accuracy of the laboratory test results, a long-term field study of the potential for ASR in concretes made with Hawaiian aggregates was initiated in 2011 with funding from the US Federal Highway Administration and the Hawaii Department of Transportation. Thirty concrete specimens were constructed from various concrete mixtures using aggregates from all Hawaiian aggregate sources, together with some US mainland aggregates known to exhibit ASR expansion. The specimens are located at an open field site in Manoa Valley on the Hawaiian island of Oahu, exposed to relatively high humidity and frequent rainfall; a weather station at the site records the ambient conditions continually. After two years of monitoring, only one of the Hawaiian aggregates showed any sign of expansion, so ten additional specimens were fabricated with this aggregate to confirm the earlier observations. Admixtures known to mitigate ASR, such as fly ash and lithium, were included in some specimens to evaluate their effect on the concrete expansion. This paper describes the field evaluation program and presents the results for all forty specimens after four years of monitoring.
Keywords: aggregate, alkali silica reaction, concrete durability, field exposure
Procedia PDF Downloads 247
Digitizing Masterpieces in Italian Museums: Techniques, Challenges and Consequences from Giotto to Caravaggio
Authors: Ginevra Addis
Abstract:
The ability to reproduce physical artifacts in digital format is one of the opportunities offered by advancements in information and communication technology most frequently promoted by museums. Indeed, the study and conservation of our cultural heritage have advanced significantly thanks to three-dimensional acquisition and modeling technology. A variety of laser scanning systems have been developed, based either on optical triangulation or on time-of-flight measurement, capable of producing digital 3D images of complex structures with high resolution and accuracy. It is necessary, however, to explore the challenges and opportunities that this practice brings to museums. The purpose of this paper is to understand what changes digital techniques introduce in museums hosting digital masterpieces. The methodology investigates three distinguished Italian exhibitions, related to the territory of Milan, and analyzes the following questions about museum practice: 1) how digitizing art masterpieces increases the number of visitors; 2) what need calls for the digitization of artworks; 3) which techniques are most used; 4) what the setting is; 5) the consequences of not publishing hard copies of catalogues; 6) how these practices are envisioned in the future. The findings show, first, how interconnection plays an important role in rebuilding a collection spread all over the world; second, how digital artwork duplication and the extension of reality entail new forms of accessibility; third, that collection and preservation through the digitization of images have both a social and an educational mission; and fourth, that convergence of the properties of different media (such as the web and radio) is key to encouraging people to become actively involved in digital exhibitions. The present analysis suggests further research that should create museum models and interaction spaces acting as catalysts for innovation.
Keywords: digital masterpieces, education, interconnection, Italian museums, preservation
Procedia PDF Downloads 175
Hybrid Thresholding Lifting Dual Tree Complex Wavelet Transform with Wiener Filter for Quality Assurance of Medical Image
Authors: Hilal Naimi, Amelbahahouda Adamou-Mitiche, Lahcene Mitiche
Abstract:
Image denoising has long been a central problem in medical imaging. The most challenging aspect of denoising is preserving data-carrying structures such as surfaces and edges in order to achieve good visual quality. Algorithms with differing denoising performance have been proposed over previous decades. More recently, models based on deep learning have shown great promise to outperform all traditional approaches; however, these techniques are limited by the need for large training sample sizes and high computational costs. This research proposes a denoising approach based on the LDTCWT (Lifting Dual Tree Complex Wavelet Transform), using hybrid thresholding with a Wiener filter to enhance image quality. The LDTCWT is a lifting-based reformulation of the dual tree complex wavelet transform that produces complex coefficients by employing a dual tree of lifting wavelet filters to obtain the real and imaginary parts. This permits the transform to achieve approximate shift invariance and directionally selective filters (properties lacking in the classical wavelet transform) while reducing computation time. To develop this approach, a hybrid thresholding function is modeled by integrating the Wiener filter into the thresholding function.
Keywords: lifting wavelet transform, image denoising, dual tree complex wavelet transform, wavelet shrinkage, Wiener filter
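One way to picture a hybrid shrinkage rule of the kind described above: coefficients below a threshold are zeroed, as in classical wavelet shrinkage, while survivors are attenuated by an empirical Wiener-style gain. This is a hedged illustration of the general idea, not the paper's exact function, and the threshold and noise variance values are made up.

```python
def hybrid_threshold(w, t, noise_var):
    """Hybrid shrinkage sketch: hard-kill small coefficients (assumed
    to be noise), then apply a Wiener-style gain w^2 / (w^2 + noise_var)
    to the survivors so that strong edges are barely attenuated."""
    if abs(w) <= t:
        return 0.0
    gain = w * w / (w * w + noise_var)  # empirical Wiener attenuation
    return gain * w

coeffs = [0.1, -0.3, 2.0, -5.0, 0.05]   # toy wavelet coefficients
den = [hybrid_threshold(w, t=0.5, noise_var=1.0) for w in coeffs]
print([round(x, 3) for x in den])  # [0.0, 0.0, 1.6, -4.808, 0.0]
```

Note how the large coefficient at -5.0 is shrunk by only about 4%, while the mid-sized one at 2.0 loses 20%: the Wiener gain preserves the edge-carrying coefficients that the abstract says must be secured.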
Procedia PDF Downloads 163
Climate Changes in Albania and Their Effect on Cereal Yield
Authors: Lule Basha, Eralda Gjika
Abstract:
This study analyzes climate change in Albania and its potential effects on cereal yields. First, monthly temperatures and rainfall in Albania were studied for the period 1960-2021. Climatic variables are important when modeling cereal yield behavior, especially when significant changes in weather conditions are observed. In the second part of the study, linear and nonlinear models explaining cereal yield are therefore constructed for the same period, 1960-2021. Multiple linear regression analysis and the lasso regression method are applied to the data, relating cereal yield to each independent variable: average temperature, average rainfall, fertilizer consumption, arable land, land under cereal production, and nitrous oxide emissions. In the regression model, heteroscedasticity is not observed, the data follow a normal distribution, and there is low correlation between factors, so multicollinearity is not a problem. Machine-learning methods, such as random forest, are used to predict cereal yield responses to climatic and other variables. Random forest showed high accuracy in predicting cereal yield compared to the other statistical models. We found that increases in average temperature negatively affect cereal yield, while the coefficients of fertilizer consumption, arable land, and land under cereal production affect production positively. The results show that random forest is an effective and versatile machine-learning method for cereal yield prediction compared to the other two methods.
Keywords: cereal yield, climate change, machine learning, multiple regression model, random forest
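The regression step can be sketched with ordinary least squares on a single predictor; the study fits a multiple model, but a temperature-only fit is enough to show the negative temperature coefficient it reports. The data below are hypothetical, not the Albanian series.

```python
def simple_ols(xs, ys):
    """Closed-form ordinary least-squares slope and intercept for one
    predictor: slope = S_xy / S_xx about the sample means."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical data: mean annual temperature (deg C) vs cereal yield (t/ha).
temp   = [11.0, 11.5, 12.0, 12.5, 13.0]
yield_ = [4.0, 3.8, 3.7, 3.4, 3.1]

slope, intercept = simple_ols(temp, yield_)
print(slope < 0)  # True: warmer years associate with lower yield here
```

A random forest would fit the same pairs non-parametrically; the linear slope is simply the most transparent way to see the sign of the temperature effect.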
Procedia PDF Downloads 91
Drape Simulation by Commercial Software and Subjective Assessment of Virtual Drape
Authors: Evrim Buyukaslan, Simona Jevsnik, Fatma Kalaoglu
Abstract:
Simulating fabric is more difficult than most other simulation tasks due to the complex mechanics of fabrics. Most virtual garment simulation software uses a mass-spring model and incorporates fabric mechanics into the simulation. The accuracy and fidelity of such software, however, remain open questions. Drape is a subjective phenomenon, and its evaluation has been studied since the 1950s; fabric and garment simulation, by contrast, is relatively new. Understanding how subjects perceive drape when looking at fabric simulations is critical as virtual try-on becomes increasingly important with the growth of online apparel sales. In the projected future of online apparel retailing, users may view their avatars and try garments on them in a virtual environment. It is well known that users will not be eager to accept this innovative technology unless it is realistic enough. It is therefore essential to understand what users see when fabrics are displayed in a virtual environment: are they able to distinguish the differences between various fabrics? The purpose of this study is to investigate human perception of virtual fabric and determine the most visually noticeable drape parameter. To this end, five different fabrics were mechanically tested, and their drape simulations were generated by commercial garment simulation software (Optitex®). The simulation images were processed by image analysis software to calculate drape parameters, namely the drape coefficient, node severity, and peak angles. A questionnaire was developed to evaluate drape properties subjectively in a virtual environment: the drape simulation images were shown to 27 subjects, who were asked to rank the samples according to the drape property in question. The answers were compared to the calculated drape parameters. The results show that subjects are quite sensitive to changes in the drape coefficient, while they are not very sensitive to changes in node dimensions and node distributions.
Keywords: drape simulation, drape evaluation, fabric mechanics, virtual fabric
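The drape coefficient mentioned above is commonly computed from a projected shadow image as the draped area in excess of the supporting disk, normalized by the flat annular area. The sketch below uses that common textbook definition with made-up dimensions; the commercial software's exact formula may differ.

```python
import math

def drape_coefficient(draped_area, disk_radius, sample_radius):
    """Drape coefficient DC = (A_draped - A_disk) / (A_flat - A_disk),
    where A_flat is the undeformed circular sample area. DC near 1
    means a stiff fabric (little drape); near 0 means a limp one."""
    disk = math.pi * disk_radius ** 2
    flat = math.pi * sample_radius ** 2
    return (draped_area - disk) / (flat - disk)

# Hypothetical measurement: a 30 cm fabric sample on an 18 cm disk
# whose projected drape shadow covers 500 cm^2.
dc = drape_coefficient(draped_area=500.0, disk_radius=9.0, sample_radius=15.0)
print(0.0 < dc < 1.0)  # True
```

Ranking the five simulated fabrics by this single number is what makes the subjective rankings in the questionnaire directly comparable to the image-analysis output.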
Procedia PDF Downloads 338
Aerodynamic Heating and Drag Reduction of Pegasus-XL Satellite Launch Vehicle
Authors: Syed Muhammad Awais Tahir, Syed Hossein Raza Hamdani
Abstract:
The last two years have seen a substantial increase in the rate of satellite launches. To keep pace with this growth, launch costs must be made affordable, especially for developing and underdeveloped countries. Launch cost is directly affected by the launch vehicle's aerodynamic performance. The Pegasus-XL SLV (Satellite Launch Vehicle) has served as a commercial SLV for the last 26 years, flying commercially from six operational sites across the US, Europe, and the Marshall Islands. Aerodynamic heating and drag contribute largely to Pegasus's flight performance. The objective of this study is to significantly reduce the aerodynamic heating and drag on Pegasus's body in the supersonic and hypersonic flight regimes. Aerodynamic data for Pegasus's first flight are validated through CFD (Computational Fluid Dynamics), after which drag and aerodynamic heating are reduced using a combination of a forward-facing cylindrical spike and a conical aero-disk at the actual operational flight conditions. CFD analysis using ANSYS Fluent will be carried out for Mach numbers from 0.83 to 7.8 and angles of attack (AoA) from -4 to +24 degrees for both the simple and spiked configurations, and the comparison will be drawn using a variety of graphs and contours. The expected drag reduction is around 15% to 25% for supersonic flight and around 30% to 50% for hypersonic flight, especially for AoA < 15°. A 5% to 10% reduction in aerodynamic heating is expected in the hypersonic regime. In conclusion, the aerodynamic performance of the air-launched Pegasus-XL SLV can be further enhanced, leading to more economical fuel usage for orbital flight.
Keywords: aerodynamics, Pegasus-XL, drag reduction, aerodynamic heating, satellite launch vehicle, SLV, spike, aero-disk
Procedia PDF Downloads 105
Variance-Aware Routing and Authentication Scheme for Harvesting Data in Cloud-Centric Wireless Sensor Networks
Authors: Olakanmi Oladayo Olufemi, Bamifewe Olusegun James, Badmus Yaya Opeyemi, Adegoke Kayode
Abstract:
Wireless sensor networks (WSNs) have contributed significantly to the emergence of various intelligent services and cloud-based applications. Most of the time, the harvested data are stored on a cloud platform for efficient management and sharing among different services or users. However, the sensitivity of the data makes them prone to various confidentiality and performance-related attacks during and after harvesting. Various security schemes have been developed to ensure the integrity and confidentiality of WSN data, but their specificity to particular attacks, together with the resource constraints and heterogeneity of WSNs, makes most of these schemes imperfect. In this paper, we propose a secure variance-aware routing and authentication scheme with two-tier verification to collect, share, and manage WSN data. The scheme is capable of classifying a WSN into different subnets, detecting any attempted wormhole or black hole attack during harvesting, and enforcing access control on the harvested data stored in the cloud. The analysis shows that the proposed scheme provides more security functionality than related schemes, solves most WSN and cloud security issues, prevents wormhole and black hole attacks, identifies attackers during data harvesting, and enforces access control on the harvested data stored in the cloud at low computational, storage, and communication overheads.
Keywords: data block, heterogeneous IoT network, data harvesting, wormhole attack, black hole attack, access control
Procedia PDF Downloads 84
Factors Affecting the Caregiving Experience of Children with Parental Mental Illnesses: A Systematic Review
Authors: N. Anjana
Abstract:
Worldwide, the prevalence of mental illnesses is increasing. The issues of persons with mental illness and their caregivers have been well documented in the literature. However, data regarding the factors affecting the caregiving experience of children with parental mental illnesses is sparse. This systematic review aimed to examine the existing literature of the factors affecting the caregiving experience of children of parents with mental illnesses. A comprehensive search of databases such as PubMed, EBSCO, JSTOR, ProQuest Central, Taylor and Francis Online, and Google Scholar were performed to identify peer-reviewed papers examining the factors associated with caregiving experiences of children with parental mental illnesses such as schizophrenia and major depression, for the 10-year period ending November 2019. Two researchers screened studies for eligibility. One researcher extracted data from eligible studies while a second performed verification of results for accuracy and completeness. Quality appraisal was conducted by both reviewers. Data describing major factors associated with caregiving experiences of children with parental mental illnesses were synthesized and reported in narrative form. Five studies were considered eligible and included in this review. Findings are organized under major themes such as the impact of parental mental illness on children’s daily life, how children provide care to their mentally ill parents as primary carers, social and relationship factors associated with their caregiving, positive and negative experiences in caregiving and how children cope with their experiences with parental mental illnesses. Literature relating to the caregiving experiences of children with parental mental illnesses is sparse. 
More research is required to better understand children's caregiving experiences related to parental mental illness, so as to better inform management for enhancing their mental health, wellbeing, and caregiving practice.
Keywords: caregiving experience, children, parental mental illnesses, wellbeing
Procedia PDF Downloads 141
Effect of Threshold Configuration on Accuracy in Upper Airway Analysis Using Cone Beam Computed Tomography
Authors: Saba Fahham, Supak Ngamsom, Suchaya Damrongsri
Abstract:
Objective: To determine the optimal threshold in Romexis software for airway volume and minimum cross-sectional area (MCA) analysis, using ImageJ as the gold standard. Materials and Methods: A total of ten cone-beam computed tomography (CBCT) images were collected. The airway volume and MCA of each patient were analyzed using the automatic airway segmentation function in the CBCT DICOM viewer (Romexis). Airway volume and MCA measurements were conducted on each CBCT sagittal view with fifteen different threshold values in Romexis, ranging from 300 to 1000. Duplicate DICOM files, in axial view, were imported into ImageJ for concurrent airway volume and MCA analysis as the gold standard. The measurements from Romexis and ImageJ were compared using a t-test with Bonferroni correction, with statistical significance set at p<0.003. Results: For airway volume, thresholds of 600 to 850, as well as 1000, yielded results not significantly different from those obtained with ImageJ. For MCA, thresholds from 400 to 850 in Romexis Viewer showed no difference from ImageJ. Notably, within the threshold range of 600 to 850, no statistically significant differences were observed in either the airway volume or the MCA analysis compared with ImageJ. Conclusion: This study demonstrated that Planmeca Romexis Viewer 6.4.3.3 with thresholds in the range of 600 to 850 yields airway volume and MCA measurements with no statistically significant variance from measurements obtained through ImageJ. This outcome holds implications for diagnosing upper airway obstructions and for post-orthodontic surgical monitoring.
Keywords: airway analysis, airway segmentation, cone beam computed tomography, threshold
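The p<0.003 criterion above follows directly from a Bonferroni correction over the fifteen threshold comparisons, as the short calculation below shows (0.05 / 15 ≈ 0.0033, stated in the abstract as p < 0.003).

```python
def bonferroni_alpha(family_alpha, n_comparisons):
    """Bonferroni-corrected per-test significance level: divide the
    family-wise alpha by the number of simultaneous comparisons so the
    overall false-positive rate stays at family_alpha."""
    return family_alpha / n_comparisons

# Fifteen threshold settings compared against the ImageJ gold standard.
alpha = bonferroni_alpha(0.05, 15)
print(round(alpha, 4))  # 0.0033
```

The correction is conservative but simple; it is the standard choice when, as here, each threshold setting is tested against the same gold standard.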
Procedia PDF Downloads 44
Microstructure Evolution and Modelling of Shear Forming
Authors: Karla D. Vazquez-Valdez, Bradley P. Wynne
Abstract:
Manufacturing needs have been changing in recent decades, prompting the study of previously underdeveloped manufacturing methods such as incremental forming processes, including shear forming. These processes use rotating tools in constant local contact with the workpiece, which is often also rotating, to generate shape. This means much lower loads are needed to forge large parts, with no need for expensive special tooling. The potential has already been established by the manufacture of high-value products, e.g., turbine and satellite parts, with high dimensional accuracy from difficult-to-manufacture materials. Huge opportunities therefore exist for these processes to replace current manufacturing methods for a range of high-value components, e.g., eliminating lengthy machining, reducing material waste and process times, or producing a complicated shape without developing expensive tooling. However, little is known about the exact deformation conditions during processing and why certain materials shear-form better than others, leading to extensive trial and error before production. Three alloys were used for this study: Ti-54M, Jethete M154, and IN718. General microscopy and electron backscatter diffraction (EBSD) were used to measure strains and orientation maps during shear forming. A design of experiments (DOE) analysis was also performed to understand the impact of process parameters on the properties of the final workpieces. This information was key to developing a reliable finite element method (FEM) model that closely reproduces the deformation paths of the process. Finally, the potential of the three materials to be shear spun was studied using the FEM model and their forming limit diagrams (FLD), which led to a rough methodology for testing the shear spinnability of various metals.
Keywords: shear forming, damage, principal strains, forming limit diagram
Procedia PDF Downloads 163
Theoretical-Experimental Investigations on Free Vibration of Glass Fiber/Polyester Composite Conical Shells Containing Fluid
Authors: Tran Ich Thinh, Nguyen Manh Cuong
Abstract:
Free vibrations of partially fluid-filled composite truncated conical shells are investigated using the Dynamic Stiffness Method (DSM), or Continuous Element Method (CEM), based on the First Order Shear Deformation Theory (FSDT) and non-viscous incompressible fluid equations. Numerical examples are given for the natural frequencies and harmonic responses of clamped-free conical shells partially and completely filled with fluid. To compare with the theoretical results, detailed experimental results were obtained on the free vibration of clamped-free conical shells partially filled with water, using a multi-vibration measuring machine (DEWEBOOK-DASYLab 5.61.10). Three glass fiber/polyester composite truncated cones were used, with a larger-end radius of 285 mm, a thickness of 2 mm, and cone lengths along the generators of 285 mm, 427.5 mm, and 570 mm, with semi-vertex angles of 27, 14, and 9 degrees respectively; the filling ratios of the contained water were 0, 0.25, 0.50, 0.75, and 1.0. The results calculated by the proposed computational model for the studied composite conical shells are in good agreement with the experiments, and they indicate that fluid filling can significantly reduce the natural frequencies of composite conical shells. Parametric studies covering circumferential wave number, fluid depth, and cone angle are carried out.
Keywords: dynamic stiffness method, experimental study, free vibration, fluid-shell interaction, glass fiber/polyester composite conical shell
Procedia PDF Downloads 498
1302 Study on Energy Transfer in Collapsible Soil During Laboratory Proctor Compaction Test
Authors: Amritanshu Sandilya, M. V. Shah
Abstract:
Collapsible soils such as loess are a common geotechnical challenge due to their potential to undergo sudden and severe settlement under certain loading conditions. The need for filling engineering to increase developable land has grown significantly in recent years, creating several difficulties in managing soil strength and stability during compaction. Numerous engineering problems, such as roadbed subsidence and pavement cracking, have been brought about by insufficient fill strength. Therefore, strict control of compaction parameters is essential to reduce these distresses. Accurately measuring the degree of compaction, often represented by compactness, is an important component of compaction control. For credible predictions of how collapsible soils will behave under complicated loading situations, the accuracy of laboratory studies is essential. Therefore, this study investigates the energy transfer in collapsible soils during laboratory Proctor compaction tests to provide insights into how energy transfer can be optimized to achieve more accurate and reliable results in compaction testing. The compaction characteristics of loess soil, in terms of energy, were studied at moisture contents corresponding to the dry side of optimum, the optimum, and the wet side of optimum, and at different compaction energy levels. The hammer impact force (E0) and soil bottom force (E) were measured using an impact load cell mounted at the bottom of the compaction mould. The variation in the energy consumption ratio (E/E0) was observed and compared with the compaction curve of the soil. The results indicate that the plot of energy consumption ratio versus moisture content can serve as a reliable indicator of the compaction characteristics of the soil in terms of energy.
Keywords: soil compaction, Proctor compaction test, collapsible soil, energy transfer
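The energy consumption ratio described above is a direct quotient of the measured energies; a minimal sketch with illustrative readings (not the study's data) follows:

```python
# E0 is the hammer impact energy and E the energy reaching the mould
# bottom; the readings below are illustrative, not the study's data.

def energy_consumption_ratio(e_bottom, e_hammer):
    """Ratio E/E0 of energy transmitted to the mould bottom to the
    energy delivered by the hammer."""
    if e_hammer <= 0:
        raise ValueError("hammer energy must be positive")
    return e_bottom / e_hammer

# Hypothetical readings at three moisture contents (dry, optimum, wet)
readings = [(42.0, 60.0), (51.0, 60.0), (38.0, 60.0)]
ratios = [energy_consumption_ratio(e, e0) for e, e0 in readings]
print(ratios)  # [0.7, 0.85, 0.6333...]
```

Plotting such ratios against moisture content reproduces the indicator curve the abstract describes.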
Procedia PDF Downloads 92
1301 Exploring the Spatial Relationship between Built Environment and Ride-hailing Demand: Applying Street-Level Images
Authors: Jingjue Bao, Ye Li, Yujie Qi
Abstract:
The explosive growth of ride-hailing has reshaped residents' travel behavior and plays a crucial role in urban mobility within the built environment. Contributing to research on the spatial variation of ride-hailing demand and its relationship to the built environment and socioeconomic factors, this study utilizes multi-source data from Haikou, China, to construct a Multi-scale Geographically Weighted Regression (MGWR) model that accounts for spatial scale heterogeneity. The regression results showed that the MGWR model demonstrated superior interpretability and reliability, with a 3.4% improvement in R2 and a reduction in AIC from 4853 to 4787, compared with the Geographically Weighted Regression (GWR) model. Furthermore, to precisely identify the surrounding environment of each sampling point, the DeepLabv3+ model is employed to segment street-level images. Features extracted from these images are incorporated as variables in the regression model, further enhancing its rationality and accuracy, with a 7.78% improvement in R2 compared with the MGWR model that considered only region-level variables. By integrating multi-scale geospatial data and utilizing advanced computer vision techniques, this study provides a comprehensive understanding of the spatial dynamics between ride-hailing demand and the urban built environment. The insights gained from this research are expected to contribute significantly to urban transportation planning and policy making, as well as to ride-hailing platforms, facilitating the development of more efficient and effective mobility solutions in modern cities.
Keywords: travel behavior, ride-hailing, spatial relationship, built environment, street-level image
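A single-scale geographically weighted regression can be sketched as below: a fixed Gaussian kernel weights observations by distance to one regression point (the study's MGWR additionally lets each covariate take its own bandwidth). All coordinates and responses here are synthetic:

```python
import numpy as np

def gwr_fit_point(X, y, coords, target, bandwidth):
    """Weighted least squares at one location with a Gaussian kernel:
    w_i = exp(-0.5 * (d_i / bandwidth)^2). Single-scale sketch, not the
    multi-scale MGWR used in the study."""
    d = np.linalg.norm(coords - target, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)
    Xw = X * w[:, None]
    beta, *_ = np.linalg.lstsq(Xw.T @ X, Xw.T @ y, rcond=None)
    return beta

rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(200, 2))
X = np.column_stack([np.ones(200), rng.normal(size=200)])
# The true coefficient varies in space: beta1 = 1 + 0.2 * x-coordinate
y = X[:, 1] * (1 + 0.2 * coords[:, 0]) + rng.normal(scale=0.1, size=200)

beta = gwr_fit_point(X, y, coords, np.array([5.0, 5.0]), bandwidth=2.0)
print(beta[1])  # close to the local coefficient 1 + 0.2*5 = 2.0
```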
Procedia PDF Downloads 81
1300 Enhancing Project Performance Forecasting using Machine Learning Techniques
Authors: Soheila Sadeghi
Abstract:
Accurate forecasting of project performance metrics is crucial for successfully managing and delivering urban road reconstruction projects. Traditional methods often rely on static baseline plans and fail to consider the dynamic nature of project progress and external factors. This research proposes a machine learning-based approach to forecast project performance metrics, such as cost variance and earned value, for each Work Breakdown Structure (WBS) category in an urban road reconstruction project. The proposed model utilizes time series forecasting techniques, including Autoregressive Integrated Moving Average (ARIMA) and Long Short-Term Memory (LSTM) networks, to predict future performance based on historical data and project progress. The model also incorporates external factors, such as weather patterns and resource availability, as features to enhance the accuracy of forecasts. By applying the predictive power of machine learning, the performance forecasting model enables proactive identification of potential deviations from the baseline plan, allowing project managers to take timely corrective actions. The research aims to validate the effectiveness of the proposed approach using a case study of an urban road reconstruction project, comparing the model's forecasts with actual project performance data. The findings of this research contribute to the advancement of project management practices in the construction industry, offering a data-driven solution for improving project performance monitoring and control.
Keywords: project performance forecasting, machine learning, time series forecasting, cost variance, earned value management
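A minimal autoregressive baseline of the kind the abstract builds on can be sketched as below; this is a hand-rolled AR(1) fit, not the full ARIMA or LSTM models, and the cost-variance series is hypothetical:

```python
import numpy as np

def ar1_forecast(series, steps):
    """Fit y[t] = c + phi * y[t-1] by least squares and roll the model
    forward. A minimal autoregressive baseline, not the full ARIMA or
    LSTM models described in the abstract."""
    y_prev, y_next = np.array(series[:-1]), np.array(series[1:])
    A = np.column_stack([np.ones_like(y_prev), y_prev])
    (c, phi), *_ = np.linalg.lstsq(A, y_next, rcond=None)
    forecasts, last = [], series[-1]
    for _ in range(steps):
        last = c + phi * last
        forecasts.append(float(last))
    return forecasts

# Hypothetical monthly cost-variance series for one WBS category
cv = [1.0, 1.2, 1.5, 1.9, 2.4, 3.0]
print(ar1_forecast(cv, 2))  # continues the upward trend past 3.0
```

A forecast drifting away from the baseline plan is exactly the deviation signal the abstract proposes to act on.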
Procedia PDF Downloads 49
1299 Development of Medical Intelligent Process Model Using Ontology Based Technique
Authors: Emmanuel Chibuogu Asogwa, Tochukwu Sunday Belonwu
Abstract:
An urgent demand for creative solutions has been created by the rapid expansion of medical knowledge, the complexity of patient care, and the requirement for more precise decision-making. The creation of a Medical Intelligent Process Model (MIPM) utilizing ontology-based techniques appears to be a promising way to overcome this obstacle and unleash the full potential of healthcare systems. The development of the MIPM is motivated by the lack of quick access to relevant medical information and of advanced tools for treatment planning and clinical decision-making, both of which ontology-based techniques can provide. The aim of this work is to develop a structured and knowledge-driven framework that leverages ontology, a formal representation of domain knowledge, to enhance various aspects of healthcare. The Object-Oriented Analysis and Design Methodology (OOADM) was adopted in the design of the system, as we desired to build a usable and evolvable application. For effective implementation of this work, we used the following materials, methods, and tools: the medical dataset used to test our model was obtained from Kaggle, and the ontology-based technique was implemented with a confusion matrix, MySQL, Python, Hypertext Markup Language (HTML), Hypertext Preprocessor (PHP), Cascading Style Sheets (CSS), JavaScript, Dreamweaver, and Fireworks. According to test results on the new system using the confusion matrix, both the accuracy and overall effectiveness of the medical intelligent process improved significantly, by 20% compared to the previous system. Therefore, using the model is recommended for healthcare professionals.
Keywords: ontology-based, model, database, OOADM, healthcare
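The confusion-matrix evaluation mentioned above can be sketched as follows; the labels and counts are hypothetical, not the Kaggle dataset used in the study:

```python
def confusion_matrix(actual, predicted, labels):
    """Build a confusion matrix: rows = actual class, columns = predicted."""
    index = {label: i for i, label in enumerate(labels)}
    m = [[0] * len(labels) for _ in labels]
    for a, p in zip(actual, predicted):
        m[index[a]][index[p]] += 1
    return m

def accuracy(matrix):
    """Fraction of predictions on the matrix diagonal."""
    correct = sum(matrix[i][i] for i in range(len(matrix)))
    total = sum(sum(row) for row in matrix)
    return correct / total

# Hypothetical diagnoses, for illustration only
actual    = ["flu", "flu", "cold", "cold", "flu", "cold"]
predicted = ["flu", "cold", "cold", "cold", "flu", "flu"]
m = confusion_matrix(actual, predicted, ["flu", "cold"])
print(m)            # [[2, 1], [1, 2]]
print(accuracy(m))  # 4/6 = 0.666...
```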
Procedia PDF Downloads 78
1298 Modelling Heat Transfer Characteristics in the Pasteurization Process of Medium Long Necked Bottled Beers
Authors: S. K. Fasogbon, O. E. Oguegbu
Abstract:
Pasteurization is one of the most important steps in the preservation of beer products; it improves shelf life by inactivating almost all the spoilage organisms present. However, it is always difficult to determine the slowest heating zone, the temperature profile, and the pasteurization units inside bottled beer during pasteurization, hence there have been significant experimental and ANSYS Fluent approaches to the problem. This work developed a computational fluid dynamics model using COMSOL Multiphysics. The model was simulated to determine the slowest heating zone, the temperature profile, and the pasteurization units inside the bottled beer during the pasteurization process, and the results of the simulation were compared with existing data in the literature. The results showed that the location and size of the slowest heating zone depend on the time-temperature combination of each zone. They also showed that the temperature profile of the bottled beer is affected by the natural convection resulting from density variation during the pasteurization process, and that the pasteurization units increase with time, subject to the temperature reached by the beer. Although the results of this work agreed with the literature regarding the slowest heating zone and temperature profiles, the pasteurization-unit results did not agree. It is suspected that these were greatly affected by the bottle geometry, specific heat capacity, and density of the beer in question. The work concludes that for effective pasteurization to be achieved, there is a need to optimize the spray water temperature and the time spent by the bottled product in each of the pasteurization zones.
Keywords: modeling, heat transfer, temperature profile, pasteurization process, bottled beer
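Pasteurization units are conventionally accumulated from the time-temperature history; a common brewing convention (1 PU = 1 minute at 60 °C, with lethality scaling as 1.393^(T - 60)) can be sketched as below. The convention and the temperature history are stated here as assumptions, not values from the paper:

```python
def pasteurization_units(minutes, temp_c, ref_temp=60.0, rate=1.393):
    """Pasteurization units under the common brewing convention:
    PU = t * rate**(T - 60), with t in minutes and T in degrees C.
    This is the standard convention, not a value from the paper."""
    return minutes * rate ** (temp_c - ref_temp)

# Accumulate PU over a hypothetical temperature history sampled once
# per minute inside the slowest heating zone
history = [55.0, 58.0, 60.0, 62.0, 62.0, 61.0]
total = sum(pasteurization_units(1.0, t) for t in history)
print(round(total, 2))  # 6.98
```

The exponential weighting is why, as the abstract notes, PU accumulation depends so strongly on the temperature actually reached by the beer.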
Procedia PDF Downloads 203
1297 Remote Sensing and GIS-Based Environmental Monitoring by Extracting Land Surface Temperature of Abbottabad, Pakistan
Authors: Malik Abid Hussain Khokhar, Muhammad Adnan Tahir, Hisham Bin Hafeez Awan
Abstract:
Continuous environmental change across the globe due to increasing land surface temperature (LST) has become a vital phenomenon nowadays. LST is accelerating because of increasing greenhouse gases in the environment, which results in the melting of ice caps, ice sheets, and glaciers. It not only has adverse effects on the vegetation and water bodies of a region but also severe impacts on monsoon areas in the form of capricious rainfall, monsoon failure, and extensive precipitation. The environment can be monitored with the help of various geographic information system (GIS) based algorithms, i.e., SC (Single Channel), DA (Dual Angle), Mao, Sobrino, and SW (Split Window). Estimation of LST is very much possible from digital image processing of satellite imagery. This paper encompasses the extraction of the LST of Abbottabad over the last ten years using the SW technique of GIS and remote sensing, by means of Landsat 7 ETM+ (Enhanced Thematic Mapper Plus) and Landsat 8, via their Thermal Infrared (TIR) sensor and Operational Land Imager (OLI sensor, absent on Landsat 7 ETM+), having 100 m TIR resolution and 30 m spectral resolution. These sensors have two TIR bands each; their emissivity and spectral radiance will be used as input statistics in the SW algorithm for LST extraction. Emissivity will be derived from Normalized Difference Vegetation Index (NDVI) threshold methods using bands 2-5 of OLI with the help of eCognition software, and spectral radiance will be extracted from the TIR bands (Bands 10-11 of Landsat 8 and Band 6 of Landsat 7 ETM+). The accuracy of the results will be evaluated against weather data as well. The ensuing research will have a significant role for all tiers of governing bodies related to climate change departments.
Keywords: environment, Landsat 8, SW algorithm, TIR
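Two building blocks of the split-window workflow described above can be sketched as follows: NDVI (used for emissivity thresholding) and brightness temperature from TIR spectral radiance. The K1/K2 defaults are the published Landsat 8 Band 10 thermal constants; the reflectance and radiance inputs are illustrative values:

```python
import math

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def brightness_temperature(radiance, k1=774.8853, k2=1321.0789):
    """At-sensor brightness temperature (kelvin) from TIR spectral
    radiance via the Planck-based inversion; K1/K2 default to the
    published Landsat 8 Band 10 thermal constants."""
    return k2 / math.log(k1 / radiance + 1.0)

print(round(ndvi(0.45, 0.15), 3))     # 0.5
# Illustrative Band 10 radiance, W/(m^2 sr um)
print(brightness_temperature(10.5))   # roughly 306 K
```

Brightness temperatures from the two TIR bands, together with NDVI-derived emissivities, are the inputs the SW algorithm combines into LST.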
Procedia PDF Downloads 355
1296 Price Effect Estimation of Tobacco on Low-wage Male Smokers: A Causal Mediation Analysis
Authors: Kawsar Ahmed, Hong Wang
Abstract:
The study's goal was to estimate the causal mediation impact of a tobacco tax before and after price hikes among low-income male smokers, with particular emphasis on the pathway framework for effect estimation with continuous and dichotomous variables. From July to December 2021, cross-sectional observational data (n=739) were collected from Bangladeshi low-wage smokers. The quasi-Bayesian technique, a binomial probit model, and sensitivity analysis using simulation in the computational R mediation package were used to estimate the effects. After a price rise for tobacco products, the average number of cigarette or bidi sticks consumed decreased from 6.7 to 4.56. Rising tobacco product prices have a direct effect on low-income people's decisions to quit or lessen their daily smoking habits: Average Causal Mediation Effect (ACME) [effect=2.31, 95% confidence interval (C.I.) = (4.71-0.00), p<0.01], Average Direct Effect (ADE) [effect=8.6, 95% C.I. = (6.8-0.11), p<0.001], and overall significant effects (p<0.001). The mediated proportion of the income effect, which describes tobacco smoking choice, is 26.1% lower following the price rise. The curves of ACME and ADE are based on observational figures of the coefficients of determination that assess the model hypothesis as a substantial consequence of the price rise in the sensitivity analysis. To reduce smoking, price increases through taxation have a positive causal mediation with income that affects the decision to limit tobacco use, and they support low-income men's healthcare policy.
Keywords: causal mediation analysis, directed acyclic graphs, tobacco price policy, sensitivity analysis, pathway estimation
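The ACME/ADE decomposition can be illustrated with a linear product-of-coefficients sketch on synthetic data; this simplifies the quasi-Bayesian procedure of the R mediation package that the study actually used, and all variable meanings and numbers are illustrative:

```python
import numpy as np

def mediation_effects(treatment, mediator, outcome):
    """Baron-Kenny product-of-coefficients sketch for the linear case:
    a = effect of treatment on mediator, b = effect of mediator on
    outcome given treatment, c' = direct effect. ACME = a*b, ADE = c'.
    A simplification of the quasi-Bayesian procedure cited above."""
    a = np.polyfit(treatment, mediator, 1)[0]
    X = np.column_stack([np.ones_like(treatment), treatment, mediator])
    coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)
    direct, b = coef[1], coef[2]
    return a * b, direct  # (ACME, ADE)

rng = np.random.default_rng(1)
t = rng.normal(size=500)                          # e.g. price exposure
m = 0.5 * t + rng.normal(scale=0.1, size=500)     # mediator (e.g. income strain)
y = 0.8 * m + 0.3 * t + rng.normal(scale=0.1, size=500)
acme, ade = mediation_effects(t, m, y)
print(acme, ade)  # estimates near the true values 0.4 and 0.3
```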
Procedia PDF Downloads 112
1295 Probabilistic Crash Prediction and Prevention of Vehicle Crash
Authors: Lavanya Annadi, Fahimeh Jafari
Abstract:
Transportation brings immense benefits to society, but it also has its costs. These include the costs of infrastructure, personnel, and equipment, but also the loss of life and property in traffic accidents on the road, delays in travel due to traffic congestion, and various indirect costs in terms of air transport. Much research has been done to identify the various factors that affect road accidents, such as road infrastructure, traffic, sociodemographic characteristics, land use, and the environment. The aim of this research is probabilistic crash prediction for vehicles in the United States using machine learning, focusing on natural and structural causes and excluding spontaneous causes such as overspeeding. These factors range from weather factors (weather conditions, precipitation, visibility, wind speed, wind direction, temperature, pressure, and humidity) to man-made road structure factors (bumps, roundabouts, no-exits, turning loops, give-ways, etc.). Probabilities are dissected into ten different classes. All the predictions are based on multiclass classification techniques, which are supervised learning. This study considers all crashes that happened in all states, as collected by the US government. To calculate the probability, the multinomial expected value was used and assigned as the classification label of crash probability. We applied three different classification models: multiclass Logistic Regression, Random Forest, and XGBoost. The numerical results show that XGBoost achieved a 75.2% accuracy rate, which indicates the part played by natural and structural causes in crashes. The paper provides in-depth insights through exploratory data analysis.
Keywords: road safety, crash prediction, exploratory analysis, machine learning
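The mapping from an estimated crash probability to one of ten classes might be sketched as below; the bucketing rule and the counts are illustrative assumptions, not the paper's exact procedure:

```python
def probability_class(counts, n_classes=10):
    """Bucket an empirical crash probability into one of ten classes,
    mirroring the abstract's description of deriving a multiclass
    label from an expected value. Illustrative, not the paper's rule."""
    total = sum(counts.values())
    p_crash = counts.get("crash", 0) / total
    return min(int(p_crash * n_classes), n_classes - 1)

# Hypothetical outcome counts for one road-segment/weather bucket
obs = {"crash": 27, "no_crash": 73}
print(probability_class(obs))  # 2  (p = 0.27 falls in class 2 of 0..9)
```

Labels produced this way are what the multiclass Logistic Regression, Random Forest, and XGBoost models would then be trained to predict.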
Procedia PDF Downloads 111
1294 Systematic Discovery of Bacterial Toxins Against Plants Pathogens Fungi
Authors: Yaara Oppenheimer-Shaanan, Nimrod Nachmias, Marina Campos Rocha, Neta Schlezinger, Noam Dotan, Asaf Levy
Abstract:
Fusarium oxysporum, a fungus that attacks a broad range of plants and can cause infections in humans, operates across different kingdoms. This pathogen encounters varied conditions, such as temperature, pH, and nutrient availability, in plant and human hosts. The Fusarium oxysporum species complex, pervasive in soils globally, can affect numerous plants, including key crops like tomatoes and bananas. Controlling Fusarium infections can involve biocontrol agents that hinder the growth of harmful strains. Our research developed a computational method to identify toxin domains within a vast number of microbial genomes, leading to the discovery of nine distinct toxins capable of killing bacteria and fungi, including Fusarium. These toxins appear to function as enzymes, causing significant damage to cellular structures, membranes, and DNA. We explored biological control using bacteria that produce polymorphic toxins, finding that certain bacteria that are non-pathogenic to plants offer a safe biological alternative for Fusarium management, as they did not harm macrophage cells or C. elegans. Additionally, we elucidated the 3D structures of two toxins with their protective immunity proteins, revealing their function as unique DNases. These potent toxins are likely instrumental in microbial competition within plant ecosystems and could serve as biocontrol agents to mitigate Fusarium wilt and related diseases.
Keywords: microbial toxins, antifungal, Fusarium oxysporum, bacterial-fungal interactions
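A domain scan of the kind described might be sketched as a motif search over protein sequences; the motif and the sequences below are made up for illustration and are not the study's actual toxin signatures or method:

```python
import re

# Toy domain scan: the motif is a made-up HxxH ... CxxC pattern, not a
# real toxin-domain signature from the study.
TOXIN_MOTIF = re.compile(r"H.{2}H.{10,30}C..C")

def scan_proteome(proteins):
    """Return identifiers of proteins containing the (hypothetical) motif."""
    return [name for name, seq in proteins.items() if TOXIN_MOTIF.search(seq)]

# Two hypothetical protein sequences
proteins = {
    "tox1": "MKHLAH" + "A" * 12 + "CDDCKL",   # contains the motif
    "hk2":  "MKTAYIAKQRQISFVKSHFSRQLEER",     # does not
}
print(scan_proteome(proteins))  # ['tox1']
```

Real pipelines would use profile models over curated domain databases rather than a single regular expression, but the scan-and-filter structure is the same.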
Procedia PDF Downloads 56
1293 Risk Issues for Controlling Floods through Unsafe, Dual Purpose, Gated Dams
Authors: Gregory Michael McMahon
Abstract:
Risk management for the purpose of minimizing damage from dam operations has met with opposition from organisations, authorities, and their practitioners. The cause appears to be a misunderstanding of risk management arising from exchanges that mix deterministic thinking with risk-centric thinking and that do not separate uncertainty from reliability or accuracy from probability. This paper sets out the misunderstandings that arose from dam operations at Wivenhoe in 2011, comparing outcomes based on the methodology and its rules with outcomes produced by applying misunderstandings of the rules. The paper addresses the performance of one risk-centric flood manual for Wivenhoe Dam in achieving a risk management outcome. A mixture of engineering, administrative, and legal factors appears to have combined to reduce the outcomes of the risk approach; these are described. The findings are that a risk-centric manual may need to assist administrations in conducting scenario training regimes, responding to healthy audit reporting, and developing decision-support systems. The principal assistance needed from the manual, however, is helping engineering and the law to a good understanding of how risks are managed; it should not be assumed that risk management is understood. The wider findings are that the critical profession for decision-making downstream of the meteorologist is not dam engineering, hydrology, or hydraulics; it is risk management. Risk management will provide the minimum flood damage outcome where actual rainfalls match or exceed forecast rainfalls; it will therefore provide the best approach over the likely history of flooding in the life of a dam, and provisions made for worst cases may be state of the art in risk management. The principal conclusion is the need for training both in risk management as a discipline and in the application of risk management rules to particular dam operational scenarios.
Keywords: risk management, flood control, dam operations, deterministic thinking
Procedia PDF Downloads 87
1292 Numerical Simulation of Large-Scale Landslide-Generated Impulse Waves With a Soil‒Water Coupling Smooth Particle Hydrodynamics Model
Authors: Can Huang, Xiaoliang Wang, Qingquan Liu
Abstract:
Soil‒water coupling is an important process in landslide-generated impulse wave (LGIW) problems, accompanied by large deformation of soil, strong interface coupling, and three-dimensional effects. As a meshless particle method, smoothed particle hydrodynamics (SPH) has great advantages in dealing with complex interface and multiphase coupling problems. This study presents an improved soil‒water coupled model to simulate LGIW problems based on the open source code DualSPHysics (v4.0). To address the low efficiency of modeling real large-scale LGIW problems, graphics processing unit (GPU) acceleration technology is implemented in this code. An experimental example, subaerial landslide-generated water waves, is simulated to demonstrate the accuracy of the model. Then the Huangtian LGIW, a real large-scale LGIW problem, is modeled to reproduce the entire disaster chain, including landslide dynamics, fluid‒solid interaction, and surge wave generation. The convergence analysis shows that a particle spacing of 5.0 m provides a converged landslide deposit and surge wave for this example. The numerical simulation results are in good agreement with the limited field survey data. The Huangtian application provides a typical reference for large-scale LGIW assessments, yielding reliable information on landslide dynamics, interface coupling behavior, and surge wave characteristics.
Keywords: soil‒water coupling, landslide-generated impulse wave, large-scale, SPH
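SPH interpolation of the kind used in such models rests on a smoothing kernel; a sketch of the standard cubic spline kernel follows (shown in 1D for brevity, with its unit-integral property checked numerically). This is the generic kernel form commonly used in SPH codes, not DualSPHysics' exact implementation:

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard cubic spline SPH kernel in 1D with smoothing length h
    (normalization 2/(3h)); support extends to |r| = 2h."""
    q = abs(r) / h
    sigma = 2.0 / (3.0 * h)
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0

# The kernel should integrate to ~1 over its support [-2h, 2h]
h = 0.1
xs = np.linspace(-2 * h, 2 * h, 4001)
dx = xs[1] - xs[0]
integral = sum(cubic_spline_kernel(x, h) for x in xs) * dx
print(round(integral, 3))  # 1.0
```

The convergence study in the abstract (particle spacing of 5.0 m) is, in effect, a test of how well this kernel-weighted interpolation resolves the landslide and wave fields.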
Procedia PDF Downloads 64
1291 Innovative Predictive Modeling and Characterization of Composite Material Properties Using Machine Learning and Genetic Algorithms
Authors: Hamdi Beji, Toufik Kanit, Tanguy Messager
Abstract:
This study aims to construct a predictive model capable of foreseeing the linear elastic and thermal characteristics of composite materials, drawing on a multitude of influencing parameters. These parameters encompass the shape of inclusions (circular, elliptical, square, triangular), their spatial coordinates within the matrix, orientation, volume fraction (ranging from 0.05 to 0.4), and variations in contrast (spanning from 10 to 200). A variety of machine learning techniques are deployed, including decision trees, random forests, support vector machines, k-nearest neighbors, and an artificial neural network (ANN), to build this predictive model. Moreover, this research goes beyond prediction by performing an inverse analysis using genetic algorithms; the intent is to recover the intrinsic characteristics of composite materials by evaluating their thermomechanical responses. The foundation of this research lies in the establishment of a comprehensive database covering the array of input parameters mentioned earlier. This database, enriched with this diversity of input variables, serves as the bedrock for the creation of machine learning and genetic algorithm-based models. These models are meticulously trained not only to predict but also to elucidate the mechanical and thermal behavior of composite materials. Remarkably, the coupling of machine learning and genetic algorithms has proven highly effective, yielding predictions with remarkable accuracy, with scores ranging between 0.97 and 0.99. This achievement marks a significant breakthrough, demonstrating the potential of this innovative approach in the field of materials engineering.
Keywords: machine learning, composite materials, genetic algorithms, mechanical and thermal properties
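The genetic-algorithm inverse analysis can be illustrated with a toy real-valued GA that recovers a volume fraction from a "measured" effective property; the mixing rule and all numbers are illustrative assumptions, not the study's models:

```python
import random

def genetic_minimize(loss, bounds, pop=40, gens=60, seed=0):
    """Tiny real-valued genetic algorithm (truncation selection, blend
    crossover, Gaussian mutation); a toy stand-in for the inverse
    analysis described in the abstract."""
    rng = random.Random(seed)
    lo, hi = bounds
    population = [rng.uniform(lo, hi) for _ in range(pop)]
    for _ in range(gens):
        parents = sorted(population, key=loss)[: pop // 2]  # keep fittest half
        children = []
        while len(children) < pop:
            a, b = rng.sample(parents, 2)
            child = 0.5 * (a + b)                      # blend crossover
            child += rng.gauss(0.0, 0.05 * (hi - lo))  # Gaussian mutation
            children.append(min(max(child, lo), hi))   # clamp to bounds
        population = children
    return min(population, key=loss)

# Inverse problem: recover the volume fraction that reproduces a
# "measured" effective modulus under a hypothetical linear mixing rule.
def effective_modulus(vf):
    return 3.0 + 40.0 * vf  # GPa, illustrative surrogate only

target = effective_modulus(0.25)  # pretend this was measured
best_vf = genetic_minimize(lambda v: (effective_modulus(v) - target) ** 2,
                           bounds=(0.05, 0.4))
print(best_vf)  # close to 0.25
```

In the study's setting, the surrogate would be one of the trained ML models rather than this closed-form rule, but the GA loop is the same.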
Procedia PDF Downloads 54
1290 The Effects of Peer Education on Condom Use Intentions: A Comprehensive Sex Education Quality Improvement Project
Authors: Janell Jayamohan
Abstract:
A pilot project based on the Theory of Planned Behavior was completed at a single-sex female international high school in order to improve the quality of comprehensive sex education in a 12th grade classroom. The student sample is representative of a growing phenomenon of "Third Culture Kids", or global nomads; often in today's world, culture transcends any one dominant influence and blends values from multiple sources. The objective was to improve intentions of condom use during the students' first or next intercourse. A peer-education session focusing on condom attitudes, social norms, and self-efficacy (central tenets of the Theory of Planned Behavior) was added to an existing curriculum in order to achieve this objective. Peer educators were given liberty in creating and executing the lesson for their homeroom, a sample of 23 senior students, with minimal intervention from faculty, the desired outcome being that the students themselves would be the best judges of what is culturally relevant and important to their peers. The school nurse and school counselor acted as faculty facilitators but did not assist in the creation or delivery of the lesson; they only checked for medical accuracy. The participating sample of students completed pre- and post-tests with validated questions assessing changes in attitudes and overall satisfaction with the peer-education lesson. As this intervention was completed during the Covid-19 pandemic, the peer-education session was held in a virtual classroom environment, limiting the modes of information delivery available to the peer educators; it is planned to be replicated in an in-person environment in subsequent cycles.
Keywords: adolescents, condoms, peer education, sex education, theory of planned behavior, third culture kids
Procedia PDF Downloads 129