Search results for: tube-based robust MPC
493 Assimilating Remote Sensing Data Into Crop Models: A Global Systematic Review
Authors: Luleka Dlamini, Olivier Crespo, Jos van Dam
Abstract:
Accurately estimating crop growth and yield is pivotal for timely, sustainable agricultural management and for ensuring food security. Crop models and remote sensing (RS) can complement each other and form a robust analysis tool to improve crop growth and yield estimations when combined. This study thus aims to systematically evaluate how research that exclusively focuses on assimilating RS data into crop models varies among countries, crops, data assimilation methods, and farming conditions. A strict search string was applied in the Scopus and Web of Science databases, and 497 potential publications were obtained. After screening for relevance with predefined inclusion/exclusion criteria, 123 publications were considered in the final review. Results indicate that over 81% of the studies were conducted in countries associated with high socio-economic and technological advancement, mainly China, the United States of America, France, Germany, and Italy. Many of these studies integrated MODIS or Landsat data into WOFOST to improve crop growth and yield estimation of staple crops at the field and regional scales. Most studies use recalibration or updating methods alongside various algorithms to assimilate remotely sensed leaf area index into crop models. However, these methods cannot account for the uncertainties in the remote sensing observations and the crop model itself. Over 85% of the studies were based on commercial and irrigated farming systems. Despite great global interest in data assimilation into crop models, limited research has been conducted in resource- and data-limited regions like Africa. We foresee great potential for such applications in those conditions, and hence for facilitating and expanding the use of this approach so that developing farming communities can benefit.
Keywords: crop models, remote sensing, data assimilation, crop yield estimation
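To make the updating idea concrete, the sketch below shows the core of one filter-based updating step, an ensemble Kalman filter assimilating a single remotely sensed leaf area index (LAI) observation into a toy growth model. The one-line crop model, the numbers, and the variable names are illustrative assumptions, not taken from any reviewed study.

```python
import numpy as np

rng = np.random.default_rng(42)

def crop_model_step(lai, growth_rate=0.05):
    """Toy stand-in for a crop model (e.g., WOFOST) propagating LAI one step."""
    return lai * (1.0 + growth_rate)

# Ensemble of model states (LAI, m^2/m^2) representing model uncertainty
ensemble = rng.normal(loc=2.0, scale=0.3, size=50)

# One remotely sensed LAI observation and its error variance
obs, obs_var = 2.6, 0.2 ** 2

# Propagate the ensemble, then apply the scalar EnKF update K = P / (P + R)
ensemble = crop_model_step(ensemble)
prior_var = ensemble.var(ddof=1)
gain = prior_var / (prior_var + obs_var)
perturbed_obs = obs + rng.normal(0.0, np.sqrt(obs_var), size=ensemble.size)
ensemble += gain * (perturbed_obs - ensemble)

print(f"analysis LAI mean: {ensemble.mean():.2f}")  # pulled toward the observation
```

The gain weighs model spread against observation error, which is how filter-based algorithms represent the two uncertainty sources the review discusses.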
Procedia PDF Downloads 131
491 The Role of Institutions in Community Wildlife Conservation in Zimbabwe
Authors: Herbert Ntuli, Edwin Muchapondwa
Abstract:
This study used a sample of 336 households and community-level data from 30 communities around the Gonarezhou National Park in Zimbabwe to analyse the association between institutions and communities' ability to self-organize (cooperation) on the one hand, and the relationship between cooperation and the success of biodiversity outcomes on the other. Using both ordinary least squares and instrumental variables estimation with heteroskedasticity-based instruments, our results confirmed that sound institutions are indeed an important ingredient for cooperation in the respective communities and that cooperation positively and significantly affects biodiversity outcomes. Group size, community-level trust, the number of stakeholders, and punishment were found to be important variables explaining cooperation. From a policy perspective, our results show that external enforcement of rules and regulations does not necessarily translate into sound ecological outcomes; better outcomes are attainable when punishment is instead endogenized by local communities. This suggests that communities should be supported in such a way that robust institutions, tailor-made to suit local conditions, can emerge and in turn facilitate good environmental husbandry. Cooperation, training, benefits, distance from the nearest urban center, distance from the fence, social capital, average age of household head, fence, and information sharing were found to be very important variables explaining the success of biodiversity outcomes, ceteris paribus. Government programmes should target capacity building in terms of institutional capacity and skills development in order to have a positive impact on biodiversity. Hence, the roles of stakeholders (e.g., NGOs) in capacity building and of government effort should complement each other to ensure that the necessary resources are mobilized and all communities receive the necessary training and resources.
Keywords: institutions, self-organize, common pool resources, wildlife, conservation, Zimbabwe
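As a sketch of what instrumental variables estimation with heteroskedasticity-based instruments involves, the following simulates the construction in the spirit of Lewbel (2012); the data-generating process, variable names, and coefficients are illustrative assumptions rather than the study's data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 336                                      # matches the household sample size
z = rng.normal(size=n)                       # exogenous covariate (e.g., group size)
u = rng.normal(size=n) * (1 + 0.5 * z**2)    # heteroskedastic first-stage error
coop = 0.5 * z + u                           # endogenous regressor: cooperation
outcome = 1.0 + 0.8 * coop + 0.5 * z + rng.normal(size=n)  # biodiversity outcome

# Heteroskedasticity-based instrument: (z - mean(z)) * first-stage residual
e1 = sm.OLS(coop, sm.add_constant(z)).fit().resid
iv = (z - z.mean()) * e1

# Two-stage least squares by hand (point estimates only)
stage1 = sm.OLS(coop, sm.add_constant(np.column_stack([z, iv]))).fit()
coop_hat = stage1.fittedvalues
stage2 = sm.OLS(outcome, sm.add_constant(np.column_stack([coop_hat, z]))).fit()
print(stage2.params)  # coefficient on coop_hat should be near the true 0.8
```

The point is that identification comes from heteroskedasticity in the first stage rather than from an excluded external instrument.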
Procedia PDF Downloads 281
490 Computational Aided Approach for Strut and Tie Model for Non-Flexural Elements
Authors: Mihaja Razafimbelo, Guillaume Herve-Secourgeon, Fabrice Gatuingt, Marina Bottoni, Tulio Honorio-De-Faria
Abstract:
The challenge of this research is to provide engineers with a robust, semi-automatic method for calculating optimal reinforcement for massive structural elements. In the absence of such a digital post-processing tool, design office engineers make intensive use of plate modelling, for which automatic post-processing is available. Plate models in massive areas, however, produce conservative results. In addition, the theoretical foundations of automatic post-processing tools for reinforcement are those of reinforced concrete beam sections. As long as there is no suitable alternative for automatic post-processing of plates, optimal modelling and a significant improvement of the constructability of massive areas cannot be expected. The strut-and-tie method is commonly used in civil engineering, but its result remains highly dependent on the judgment of the design engineer. The tool developed here will support engineers in their choice of structure. The method implemented consists of defining a ground structure built on the basis of the principal stresses resulting from an elastic analysis of the structure, and then starting an optimization of this structure according to the fully stressed design method. The first results yield a coherent initial network of struts and ties, consistent with the cases encountered in the literature. The evolution of the tool will then make it possible to adapt the obtained latticework to the cracking states resulting from the loads applied during the life of the structure, including cyclic and dynamic loads. In addition, to satisfy the constructability constraint, a final reinforcement layout with an orthogonal arrangement and regulated spacing will be implemented in the tool.
Keywords: strut and tie, optimization, reinforcement, massive structure
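As a sketch of the fully stressed design step named above: member areas are rescaled so that each member tends toward its allowable stress, which starves low-force members and leaves the load-bearing strut-and-tie network. The forces, the allowable stress, and the shortcut of holding forces fixed are illustrative assumptions.

```python
import numpy as np

def fully_stressed_design(forces, areas, sigma_allow, n_iter=20, a_min=1e-8):
    """Classic fully-stressed-design loop over truss member areas.
    In a real tool the member forces would be recomputed from a fresh
    elastic analysis of the ground structure at every iteration; they
    are held fixed here for brevity."""
    for _ in range(n_iter):
        stresses = forces / areas
        areas = np.maximum(areas * np.abs(stresses) / sigma_allow, a_min)
    return areas

forces = np.array([120e3, -80e3, 15e3, -0.5e3])  # N; ties positive, struts negative
areas = fully_stressed_design(forces, np.full(4, 1e-3), sigma_allow=200e6)
print(areas)  # near-zero-force members shrink toward a_min and can be pruned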
Procedia PDF Downloads 141
489 Development of a Predictive Model to Prevent Financial Crisis
Authors: Tengqin Han
Abstract:
Delinquency has been a crucial factor in economics throughout the years. Commonly seen in credit cards and mortgages, it played one of the crucial roles in causing the most recent financial crisis in 2008. In each case, a delinquency is a sign that the borrower is unable to pay off the debt, and it may cause a loss of property in the end. Individually, one case of delinquency seems unimportant compared to the entire credit system. In China, an emerging economic entity, national and economic strength have grown rapidly, and the gross domestic product (GDP) growth rate has remained as high as 8% over the past decades. However, potential risks exist behind the appearance of prosperity. Among these risks, the credit system is the most significant one. Due to the long terms and large balances of mortgages, it is critical to monitor the risk during the performance period. In this project, about 300,000 mortgage account records are analyzed in order to develop a predictive model for the probability of delinquency. Through univariate analysis, the data are cleaned up, and through bivariate analysis, the variables with strong predictive power are detected. The project is divided into two parts. In the first part, the analysis data of 2005 are split into two parts: 60% for model development and 40% for in-time model validation. The KS of model development is 31, and the KS for in-time validation is 31, indicating the model is stable. In addition, the model is further validated by out-of-time validation, which uses 40% of the 2006 data; the KS is 33. This indicates the model is still stable and robust. In the second part, the model is improved by the addition of macroeconomic indexes, including GDP, consumer price index, unemployment rate, inflation rate, etc. The data of 2005 to 2010 are used for model development and validation. Compared with the base model (without macroeconomic variables), KS is increased from 41 to 44, indicating that the macroeconomic variables can be used to improve the separation power of the model and make the prediction more accurate.
Keywords: delinquency, mortgage, model development, model validation
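For reference, the KS (Kolmogorov-Smirnov) separation statistic quoted throughout can be computed as below; the simulated scores are placeholders for the mortgage data.

```python
import numpy as np

def ks_statistic(scores, labels):
    """KS separation: maximum gap between the cumulative score distributions
    of delinquent (1) and non-delinquent (0) accounts, scaled to 0-100."""
    order = np.argsort(scores)
    labels = np.asarray(labels)[order]
    cum_bad = np.cumsum(labels == 1) / np.sum(labels == 1)
    cum_good = np.cumsum(labels == 0) / np.sum(labels == 0)
    return 100 * np.max(np.abs(cum_bad - cum_good))

rng = np.random.default_rng(1)
labels = rng.integers(0, 2, 10_000)
scores = rng.normal(loc=0.8 * labels, scale=1.0)  # higher score = riskier (toy)
print(f"KS = {ks_statistic(scores, labels):.0f}")
```

A higher KS means the score ranks delinquent accounts further from non-delinquent ones, which is why the 41-to-44 gain indicates improved separation power.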
Procedia PDF Downloads 228
488 Use Cloud-Based Watson Deep Learning Platform to Train Models Faster and More Accurate
Authors: Susan Diamond
Abstract:
Machine learning workloads have traditionally been run in high-performance computing (HPC) environments, where users log in to dedicated machines and utilize the attached GPUs to run training jobs on huge datasets. Training of large neural network models is very resource intensive, and even after exploiting parallelism and accelerators such as GPUs, a single training job can still take days. Consequently, the cost of hardware is a barrier to entry. Even when upfront cost is not a concern, the lead time to set up such an HPC environment takes months, from acquiring hardware to setting it up with the right firmware and software installed and configured. Furthermore, scalability is hard to achieve in a rigid traditional lab environment, which is therefore slow to react to dynamic change in the artificial intelligence industry. Watson Deep Learning as a Service is a cloud-based deep learning platform that mitigates the long lead time and high upfront investment in hardware. It enables robust and scalable sharing of resources among the teams in an organization and is designed for on-demand cloud environments. Providing a similar user experience in a multi-tenant cloud environment comes with its own unique challenges regarding fault tolerance, performance, and security. Watson Deep Learning as a Service tackles these challenges and presents a deep learning stack for cloud environments in a secure, scalable and fault-tolerant manner. It supports a wide range of deep learning frameworks, such as TensorFlow, PyTorch, Caffe, Torch, Theano, and MXNet. These frameworks reduce the effort and skillset required to design, train, and use deep learning models. Deep Learning as a Service is used at IBM by AI researchers in areas including machine translation, computer vision, and healthcare.
Keywords: deep learning, machine learning, cognitive computing, model training
Procedia PDF Downloads 209
487 Hyperspectral Imaging and Nonlinear Fukunaga-Koontz Transform Based Food Inspection
Authors: Hamidullah Binol, Abdullah Bal
Abstract:
Nowadays, food safety is a great public concern; therefore, robust and effective techniques are required for assessing the safety of goods. Hyperspectral imaging (HSI) is an attractive tool for researchers inspecting food quality and safety, with applications such as meat quality assessment, automated poultry carcass inspection, quality evaluation of fish, bruise detection of apples, quality analysis and grading of citrus fruits, bruise detection of strawberries, visualization of sugar distribution of melons, measuring ripening of tomatoes, defect detection of pickling cucumbers, and classification of wheat kernels. HSI can be used to concurrently collect large amounts of spatial and spectral data on the objects being observed. This technique yields exceptional detection capability, which otherwise cannot be achieved with either imaging or spectroscopy alone. This paper presents a nonlinear technique based on the kernel Fukunaga-Koontz transform (KFKT) for detection of fat content in ground meat using HSI. The KFKT, the nonlinear version of the FKT, is one of the most effective techniques for solving problems of a two-pattern nature. The conventional FKT method has been improved with kernel machines to increase the nonlinear discrimination ability and capture higher-order statistics of the data. The approach proposed in this paper aims to segment the fat content of ground meat by regarding fat as the target class, which is to be separated from the remaining classes (treated as clutter). We have applied the KFKT to visible and near-infrared (VNIR) hyperspectral images of ground meat to determine fat percentage. The experimental studies indicate that the proposed technique produces high detection performance for fat ratio in ground meat.
Keywords: food (ground meat) inspection, Fukunaga-Koontz transform, hyperspectral imaging, kernel methods
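The linear core of the Fukunaga-Koontz transform that the paper kernelizes can be sketched as follows: whiten the summed class covariances, after which the two transformed covariances share eigenvectors and their eigenvalues sum to one, so directions most energetic for the target class are least energetic for the clutter. The toy data below stand in for VNIR pixel spectra and are an illustrative assumption.

```python
import numpy as np

def fkt_operator(x_target, x_clutter):
    """Linear Fukunaga-Koontz transform (the paper's KFKT kernelizes this)."""
    s1 = np.cov(x_target, rowvar=False)
    s2 = np.cov(x_clutter, rowvar=False)
    vals, vecs = np.linalg.eigh(s1 + s2)
    p = vecs @ np.diag(vals ** -0.5)            # whitens s1 + s2
    lam, phi = np.linalg.eigh(p.T @ s1 @ p)     # shared eigenvectors
    return p @ phi, lam                         # lam near 1 -> target-dominant

rng = np.random.default_rng(0)
target = rng.normal(size=(200, 5)) * [3, 1, 1, 1, 1]   # toy "fat" spectra
clutter = rng.normal(size=(200, 5)) * [1, 1, 1, 1, 3]  # toy background spectra
w, lam = fkt_operator(target, clutter)
print(np.round(lam, 2))  # eigenvalues in [0, 1]; the extremes discriminate best
```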
Procedia PDF Downloads 431
486 Elucidating Microstructural Evolution Mechanisms in Tungsten via Layerwise Rolling in Additive Manufacturing: An Integrated Simulation and Experimental Approach
Authors: Sadman Durlov, Aditya Ganesh-Ram, Hamidreza Hekmatjou, Md Najmus Salehin, Nora Shayesteh Ameri
Abstract:
In the field of additive manufacturing, tungsten stands out for its exceptional resistance to high temperatures, making it an ideal candidate for use in extreme conditions. However, its inherent brittleness and vulnerability to thermal cracking pose significant challenges to its manufacturability. This study explores the microstructural evolution of tungsten processed through layer-wise rolling in laser powder bed fusion additive manufacturing, utilizing a comprehensive approach that combines advanced simulation techniques with empirical research. We aim to uncover the complex processes of plastic deformation and microstructural transformations, with a particular focus on the dynamics of grain size, boundary evolution, and phase distribution. Our methodology employs a combination of simulation and experimental data, allowing for a detailed comparison that elucidates the key mechanisms influencing microstructural alterations during the rolling process. This approach facilitates a deeper understanding of the material's behavior under additive manufacturing conditions, specifically in terms of deformation and recrystallization. The insights derived from this research not only deepen our theoretical knowledge but also provide actionable strategies for refining manufacturing parameters to improve the mechanical properties and functional performance of tungsten components. By integrating simulation with practical experimentation, this study significantly enhances the field of materials science, offering a robust framework for the development of durable materials suited for challenging operational environments. Our findings pave the way for optimizing additive manufacturing techniques and expanding the use of tungsten across various demanding sectors.
Keywords: additive manufacturing, layer-wise rolling, refractory materials, in-situ microstructure modifications
Procedia PDF Downloads 61
485 Object Detection in Digital Images under Non-Standardized Conditions Using Illumination and Shadow Filtering
Authors: Waqqas-ur-Rehman Butt, Martin Servin, Marion Pause
Abstract:
In recent years, object detection has gained much attention and has become a very active research area in the field of computer vision. Robust detection of object boundaries in an image is demanded in numerous applications of human-computer interaction and automated surveillance systems. Many methods and approaches have been developed for automatic object detection in various fields, such as automotive, quality control management, and environmental services. Unfortunately, to the best of our knowledge, object detection under illumination with shadow consideration has not been well solved yet, and this problem is one of the major hurdles keeping object detection methods from practical application. This paper presents an approach to automatic object detection in images under non-standardized environmental conditions. A key challenge is how to detect the object, particularly under uneven illumination conditions. Depending on the image capturing conditions, the algorithms need to consider a variety of possible environmental factors, as colour information, lighting, and shadows vary from image to image. Existing methods mostly fail to produce appropriate results due to variation in colour information, lighting effects, threshold specifications, histogram dependencies, and colour ranges. To overcome these limitations, we propose an object detection algorithm, with pre-processing methods, to reduce the interference caused by shadow and illumination effects without fixed parameters. We use the YCrCb colour model without any specific colour ranges or predefined threshold values. The segmented object regions are further classified using morphological operations (erosion and dilation) and contours. The proposed approach was applied to a large image data set acquired under various environmental conditions for wood stack detection. Experiments show the promising results of the proposed approach in comparison with existing methods.
Keywords: image processing, illumination equalization, shadow filtering, object detection
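A minimal OpenCV sketch of the pipeline stages named above (YCrCb conversion, illumination equalization, thresholding without fixed global parameters, then erosion/dilation and contour extraction); the block size, kernel size, and area cutoff are illustrative assumptions, not the paper's settings.

```python
import cv2
import numpy as np

def detect_objects(img_bgr):
    """Illumination-robust segmentation sketch: equalize luma in YCrCb,
    adaptively threshold, clean with erosion/dilation, extract contours."""
    ycrcb = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2YCrCb)
    y, cr, cb = cv2.split(ycrcb)
    y = cv2.equalizeHist(y)                        # flatten uneven lighting
    norm = cv2.merge([y, cr, cb])
    gray = cv2.cvtColor(cv2.cvtColor(norm, cv2.COLOR_YCrCb2BGR),
                        cv2.COLOR_BGR2GRAY)
    # Local (adaptive) threshold instead of a fixed global value
    mask = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                 cv2.THRESH_BINARY, 31, 5)
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.dilate(cv2.erode(mask, kernel), kernel)  # opening: drop speckle
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) > 500]

img = np.full((240, 320, 3), 128, np.uint8)  # placeholder for a real image
print(len(detect_objects(img)))
```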
Procedia PDF Downloads 216
484 Determinants of Green Strategy: Analysis Using Probit and Logit Models
Authors: Ayushi Modi, Eliot Bochet-Merand
Abstract:
This study investigates the structural determinants of green strategies among Small and Medium Enterprises (SMEs) in the European Union and select countries, utilizing data from the Flash Eurobarometer 498 - SMEs, Resource Efficiency, and Green Markets. By applying sequential logit analysis, we explore the drivers behind the adoption and scaling of green actions, such as resource efficiency, waste management, and product innovation, while also examining the provision of green products and services. A key contribution of this research is the novel distinction between the process stage (green actions) and the product stage (green outputs), allowing for a deeper analysis of how green initiatives translate into sustainable business outcomes. Our findings reveal that structural characteristics, such as firm size, sector, and turnover growth, significantly influence the likelihood of both providing green products and implementing comprehensive green actions. Smaller, younger firms in high-impact sectors like construction and industry are more likely to engage in sustainability efforts, particularly when they have a green strategy and a dedicated green workforce. Furthermore, companies serving B2B and B2C clients and experiencing turnover growth are more inclined to offer green products. The study underscores the economic implications of these insights, suggesting that financial flexibility, strategic commitment, and human capital investments are critical for scaling green initiatives. By refining variables and excluding heterogeneous countries, our data management ensures robust results. This research provides novel insights into the distinct roles of process and product stages in sustainability, offering valuable policy recommendations for promoting environmental performance in SMEs.
Keywords: green strategy, resource efficiency, SMEs, sustainability, product innovation, environmental performance
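As a sketch of the title's probit/logit machinery on toy firm-level data (the actual study uses the Flash Eurobarometer 498 microdata and a sequential logit; all names and coefficients here are invented for illustration):

```python
import numpy as np
import statsmodels.api as sm

# Toy stand-in for the firm-level question: does a firm offer green products?
rng = np.random.default_rng(7)
n = 500
size = rng.integers(1, 250, n)            # employees
growth = rng.normal(0.02, 0.1, n)         # turnover growth
strategy = rng.integers(0, 2, n)          # has a formal green strategy
logit_p = -2.0 + 0.004 * size + 3.0 * growth + 1.2 * strategy
y = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(float)

X = sm.add_constant(np.column_stack([size, growth, strategy]))
model = sm.Logit(y, X).fit(disp=0)        # sm.Probit(y, X) is the probit analogue
print(model.params)                       # marginal effects: model.get_margeff()
```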
Procedia PDF Downloads 18
483 Evaluating the Impact of English Immersion in Kolkata’s High-Cost Private Schools
Authors: Ashmita Bhattacharya
Abstract:
This study aims to investigate whether the English immersion experience offered by Kolkata’s high-cost private English-medium schools leads to additive or subtractive language learning outcomes for students. In India, English has increasingly become associated with power, social status, and socio-economic mobility. As a result, a proliferation of English-medium schools has emerged across Kolkata and the wider Indian context. While in some contexts English language learning can be an additive experience, in others it can be subtractive, where proficiency in English is developed at the expense of students’ native language proficiency development. Subtractive educational experiences can potentially have severe implications, including heritage language loss, detachment from cultural roots, and a diminished sense of national identity. Thus, with the use of semi-structured interviews, the language practices and lived experiences of 12 former students who attended high-cost private English-medium schools in Kolkata were thoroughly explored. The data collected were thematically coded, and analysis was conducted using the thematic analysis approach. The findings indicate that the English immersion experience at Kolkata’s high-cost private English-medium schools provides a subtractive language learning experience to students. Additionally, this study suggests that robust home-based support for native languages might be crucial for mitigating the effects of subtractive English education. Furthermore, the study underscores the importance of integrating opportunities within schools that promote Indian languages and cultures, as this can create a more positive, inclusive, and culturally responsive environment. Finally, although subject to further evaluation, the study recommends the implementation of bilingual and multilingual educational systems and provides suggestions for future research in this area.
Keywords: bilingual education, English immersion, language loss, multilingual education, subtractive language learning
Procedia PDF Downloads 29
482 Analysis and Design of Inductive Power Transfer Systems for Automotive Battery Charging Applications
Authors: Wahab Ali Shah, Junjia He
Abstract:
Transferring electrical power without any wiring has been a dream since the late 19th century. There were some early advances in this area related to microwave systems. However, the subject has recently become very attractive due to practical systems. There are low-power applications, such as charging the batteries of contactless toothbrushes or implanted devices, and higher-power applications, such as charging the batteries of electric automobiles or buses. In the first group of applications, operating frequencies are in the microwave range, while the frequency is lower in high-power applications. In the latter, the concept is also called inductive power transfer. The aim of the paper is to give an overview of inductive power transfer for electrical vehicles, with a special concentration on coil design and power converter simulation for static charging. Coil design is one of the most critical tasks and is very important for efficient and safe power transfer. Power converters are used on both sides of the system. The converter on the primary side is used to generate a high-frequency voltage to excite the primary coil. The purpose of the converter on the secondary side is to rectify the voltage transferred from the primary to charge the battery. In this paper, an inductive power transfer system is studied. Inductive power transfer is a promising technology with several possible applications. Operation principles of these systems are explained, and components of the system are described. Finally, a single-phase 2 kW system was simulated and results were presented. The work presented in this paper is just an introduction to the concept. A reformed compensation network based on the traditional inductor-capacitor-inductor (LCL) topology is proposed to realize robust reaction to the large coupling variation that is common in dynamic wireless charging applications. In the future, this type of compensation should be studied. Also, a comparison of different compensation topologies should be done for the same power level.
Keywords: coil design, contactless charging, electrical automobiles, inductive power transfer, operating frequency
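As a worked illustration of the compensation idea: in the simplest series-compensated case, the capacitor is sized so that the coil resonates at the operating frequency. The 85 kHz band and the coil inductance below are assumptions for illustration; sizing the proposed LCL network involves additional elements but follows the same resonance condition.

```python
import numpy as np

f = 85e3                     # Hz; a common automotive IPT operating band (assumed)
L_primary = 120e-6           # H; primary coil self-inductance (assumed)
omega = 2 * np.pi * f
C_series = 1 / (omega ** 2 * L_primary)   # resonance condition: omega^2 * L * C = 1
print(f"series compensation capacitance: {C_series * 1e9:.1f} nF")  # ~29 nF
```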
Procedia PDF Downloads 249
481 The Optimization of Sexual Health Resource Information and Services for Persons with Spinal Cord Injury
Authors: Nasrin Nejatbakhsh, Anita Kaiser, Sander Hitzig, Colleen McGillivray
Abstract:
Following spinal cord injury (SCI), many individuals experience anxiety in adjusting to their lives and to its impacts on their sexuality. Research has demonstrated that regaining sexual function is a very high priority for individuals with SCI. Despite this, sexual health is one of the least likely areas of focus in the rehabilitation of individuals with SCI. There is currently a considerable gap in appropriate education and resources that address the sexual health concerns and needs of people with spinal cord injury. Furthermore, the determinants of sexual health in individuals with SCI are poorly understood and thus poorly addressed. The purpose of this study was to improve current practices by informing a service delivery model that rehabilitation centers can adopt for appropriate delivery of their services. Methodology: We utilized qualitative methods in the form of a semi-structured interview containing open-ended questions to assess 1) sexual health concerns, 2) helpful strategies in current resources, 3) unhelpful strategies in current resources, and 4) barriers to obtaining sexual health information. In addition to the interviews, participants completed surveys to identify socio-demographic factors. Data gathered were coded and evaluated for emerging themes and subthemes through a 'code-recode' technique. Results: We have identified several robust themes that are important for SCI sexual health resource development. Through analysis of these themes and their subthemes, several important concepts have emerged that could provide agencies with helpful strategies for providing sexual health resources. Some of the important considerations are that services be anonymous, accessible, frequent, affordable, mandatory, casual, and supported by peers. Implications: By incorporating the perspectives of individuals with SCI, the findings from this study can be used to develop appropriate sexual health services and improve access to information through tailored, needs-based program development.
Keywords: spinal cord injury, sexual health, determinants of health, resource development
Procedia PDF Downloads 251
480 Analytical Model of Multiphase Machines Under Electrical Faults: Application on Dual Stator Asynchronous Machine
Authors: Nacera Yassa, Abdelmalek Saidoune, Ghania Ouadfel, Hamza Houassine
Abstract:
The rapid advancement in electrical technologies has underscored the increasing importance of multiphase machines across various industrial sectors. These machines offer significant advantages in terms of efficiency, compactness, and reliability compared to their single-phase counterparts. However, early detection and diagnosis of electrical faults remain critical challenges to ensure the durability and safety of these complex systems. This paper presents an advanced analytical model for multiphase machines, with a particular focus on dual stator asynchronous machines. The primary objective is to develop a robust diagnostic tool capable of effectively detecting and locating electrical faults in these machines, including short circuits, winding faults, and voltage imbalances. The proposed methodology relies on an analytical approach combining electrical machine theory, modeling of magnetic and electrical circuits, and advanced signal analysis techniques. By employing detailed analytical equations, the developed model accurately simulates the behavior of multiphase machines in the presence of electrical faults. The effectiveness of the proposed model is demonstrated through a series of case studies and numerical simulations. In particular, special attention is given to analyzing the dynamic behavior of machines under different types of faults, as well as optimizing diagnostic and recovery strategies. The obtained results pave the way for new advancements in the field of multiphase machine diagnostics, with potential applications in various sectors such as automotive, aerospace, and renewable energies. By providing precise and reliable tools for early fault detection, this research contributes to improving the reliability and durability of complex electrical systems while reducing maintenance and operation costs.
Keywords: faults, diagnosis, modelling, multiphase machine
Procedia PDF Downloads 63
479 A Systematic Review of the Methodological and Reporting Quality of Case Series in Surgery
Authors: Riaz A. Agha, Alexander J. Fowler, Seon-Young Lee, Buket Gundogan, Katharine Whitehurst, Harkiran K. Sagoo, Kyung Jin Lee Jeong, Douglas G. Altman, Dennis P. Orgill
Abstract:
Introduction: Case series are an important and common study type. Currently, no guideline exists for reporting case series, and there is evidence of key data being missed from such reports. We propose to develop a reporting guideline for case series using a methodologically robust technique. The first step in this process is a systematic review of literature relevant to the reporting deficiencies of case series. Methods: A systematic review of methodological and reporting quality in surgical case series was performed. The electronic search strategy was developed by an information specialist and included MEDLINE, EMBASE, Cochrane Methods Register, Science Citation Index and Conference Proceedings Citation Index, from the start of indexing until 5th November 2014. Independent screening, eligibility assessments and data extraction were performed. Included articles were analyzed for five areas of deficiency: failure to use standardized definitions; missing or selective data; lack of transparency or incomplete reporting; whether alternate study designs were considered; and other issues. Results: The database searching identified 2,205 records. Through the process of screening and eligibility assessments, 92 articles met the inclusion criteria. The frequencies of the methodological and reporting issues identified were: failure to use standardized definitions (57%), missing or selective data (66%), lack of transparency or incomplete reporting (70%), alternate study designs not considered (11%), and other issues (52%). Conclusion: The methodological and reporting quality of surgical case series needs improvement. Our data show that clear evidence-based guidelines for the conduct and reporting of case series may be useful to those planning or conducting them.
Keywords: case series, reporting quality, surgery, systematic review
Procedia PDF Downloads 359
478 A Monolithic Arbitrary Lagrangian-Eulerian Finite Element Strategy for Partly Submerged Solid in Incompressible Fluid with Mortar Method for Modeling the Contact Surface
Authors: Suman Dutta, Manish Agrawal, C. S. Jog
Abstract:
Accurate computation of hydrodynamic forces on floating structures and their deformation finds application in ocean and naval engineering and wave energy harvesting. This manuscript presents a monolithic finite element strategy for fluid-structure interaction involving hyperelastic solids partly submerged in an incompressible fluid. A velocity-based arbitrary Lagrangian-Eulerian (ALE) formulation has been used for the fluid, and a displacement-based Lagrangian approach has been used for the solid. The flexibility of the ALE technique permits us to treat the free surface of the fluid as a Lagrangian entity. At the interface, the continuity of displacement, velocity and traction are enforced using the mortar method. In the mortar method, the constraints are enforced in a weak sense using the Lagrange multiplier method. In the literature, the mortar method has been shown to be robust in solving various contact mechanics problems. The time-stepping strategy used in this work reduces to the generalized trapezoidal rule in the Eulerian setting. In the Lagrangian limit, in the absence of external load, the algorithm conserves the linear and angular momentum and the total energy of the system. The use of monolithic coupling with an energy-conserving time-stepping strategy gives an unconditionally stable algorithm and allows the user to take large time steps. All the governing equations and boundary conditions have been mapped to the reference configuration. The use of the exact tangent stiffness matrix ensures that the algorithm converges quadratically within each time step. The robustness and good performance of the proposed method are demonstrated by solving benchmark problems from the literature.
Keywords: ALE, floating body, fluid-structure interaction, monolithic, mortar method
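For readers unfamiliar with the mortar method, the interface condition takes, in generic form, a weak statement of the kind below; this is the standard construction, not the manuscript's exact notation.

```latex
% Velocity continuity on the fluid-solid interface \Gamma enforced weakly
% through a Lagrange multiplier field \lambda (the mortar traction):
\[
  \int_{\Gamma} \boldsymbol{\mu} \cdot
  \left( \boldsymbol{v}_f - \dot{\boldsymbol{u}}_s \right) \, \mathrm{d}\Gamma = 0
  \qquad \forall\, \boldsymbol{\mu},
\]
% where \boldsymbol{v}_f is the fluid velocity, \dot{\boldsymbol{u}}_s the
% solid velocity, and \lambda enters both momentum balances as the
% interface traction.
```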
Procedia PDF Downloads 274
477 A Sociological Investigation on the Population and Public Spaces of Nguyen Cong Tru, a Soviet-Style Collective Housing Complex in Hanoi in Regards to Its New Community-Focused Architectural Design
Authors: Duy Nguyen Do, Bart Julien Dewancker
Abstract:
Many Soviet-style collective housing complexes (also known as KTT) have been built since the 1960s in Hanoi to support the post-war population growth. Those low-rise buildings have created well-knit, robust communities, so much so that in most complexes, all families in one housing block would know each other, occasionally interact, and provide support when needed. To understand how the communities of collective housing complexes have developed and been maintained, in order to adapt their advantages into modern housing designs, the study is executed on the site of Nguyen Cong Tru KTT. This is one of the oldest KTT in Hanoi, completed in 1954. The complex also has a unique characteristic that is closely related to its community: the symbiotic relationship with Hom, a flea market that has been co-developing with Nguyen Cong Tru KTT since its beginning. The research consists of three phases: the first phase is a sociological investigation with Nguyen Cong Tru KTT’s current residents and a site survey of the complex’s economic and architectural characteristics. In the second phase, the collected data are analyzed to find out residents’ opinions concerning their satisfaction with the current housing status, floor plan organization, community, the relationship between the KTT’s dedicated public spaces and the flea market, and their usage. Simultaneously, the master plan and gathered information regarding the current architectural characteristics of the complex are also inspected. In the third phase, the results of the analyses will provide information regarding the issues, positive trends, and significant historical features of the complex’s architecture in order to generate suitable proposals for the redesigning project of Nguyen Cong Tru KTT, a design focused on vitalizing modern apartments’ communities.
Keywords: collective house community, collective house public space, community-focused, redesigning Nguyen Cong Tru KTT, sociological investigation
Procedia PDF Downloads 363
476 A Deep Learning Approach to Real Time and Robust Vehicular Traffic Prediction
Authors: Bikis Muhammed, Sehra Sedigh Sarvestani, Ali R. Hurson, Lasanthi Gamage
Abstract:
Vehicular traffic events have highly complex spatial correlations and temporal interdependencies and are also influenced by environmental events such as weather conditions. To capture these spatial and temporal interdependencies and make more realistic vehicular traffic predictions, graph neural network (GNN) based traffic prediction models have been extensively utilized due to their capability of capturing non-Euclidean spatial correlation very effectively. However, most existing GNN-based traffic prediction models have limitations in learning complex and dynamic spatial and temporal patterns due to the following missing factors. First, most GNN-based traffic prediction models have used static distance or sometimes haversine distance mechanisms between spatially separated traffic observations to estimate spatial correlation. Second, most GNN-based traffic prediction models have not incorporated environmental events that have a major impact on normal traffic states. Finally, most GNN-based models did not use an attention mechanism to focus on only important traffic observations. The objective of this paper is to study and make real-time vehicular traffic predictions while incorporating the effect of weather conditions. To fill the previously mentioned gaps, our prediction model uses the real-time driving distance between sensors to build a distance matrix or spatial adjacency matrix and capture spatial correlation. In addition, our prediction model considers the effect of six types of weather conditions and has an attention mechanism in both spatial and temporal data aggregation. Our prediction model efficiently captures the spatial and temporal correlation between traffic events; it relies on graph attention network (GAT) and bidirectional long short-term memory (Bi-LSTM) plus attention layers and is called GAT-BILSTMA.
Keywords: deep learning, real time prediction, GAT, Bi-LSTM, attention
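A compact PyTorch sketch of the GAT-then-Bi-LSTM pattern the abstract describes (spatial attention over a driving-distance adjacency, then temporal modeling); the layer sizes, the toy graph, and the single-head attention are simplifying assumptions relative to the full GAT-BILSTMA model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GATBiLSTM(nn.Module):
    def __init__(self, n_feat, n_hid):
        super().__init__()
        self.w = nn.Linear(n_feat, n_hid, bias=False)
        self.a = nn.Linear(2 * n_hid, 1, bias=False)  # attention scorer
        self.lstm = nn.LSTM(n_hid, n_hid, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * n_hid, 1)

    def forward(self, x, adj):
        # x: (time, nodes, feats); adj: (nodes, nodes) driving-distance graph
        h = self.w(x)                                    # (T, N, H)
        t, n, hid = h.shape
        hi = h.unsqueeze(2).expand(t, n, n, hid)
        hj = h.unsqueeze(1).expand(t, n, n, hid)
        e = F.leaky_relu(self.a(torch.cat([hi, hj], -1)), 0.2).squeeze(-1)
        att = torch.softmax(e.masked_fill(adj == 0, float("-inf")), dim=-1)
        h = att @ h                                      # spatial aggregation
        out, _ = self.lstm(h.transpose(0, 1))            # (N, T, 2H)
        return self.head(out[:, -1])                     # next-step value per sensor

model = GATBiLSTM(n_feat=2, n_hid=16)
x = torch.randn(12, 5, 2)                                # 12 steps, 5 sensors
adj = ((torch.rand(5, 5) > 0.5).float() + torch.eye(5)).clamp(max=1)
print(model(x, adj).shape)                               # torch.Size([5, 1])
```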
Procedia PDF Downloads 72
475 Investigation of Ductile Failure Mechanisms in SA508 Grade 3 Steel via X-Ray Computed Tomography and Fractography Analysis
Authors: Suleyman Karabal, Timothy L. Burnett, Egemen Avcu, Andrew H. Sherry, Philip J. Withers
Abstract:
SA508 Grade 3 steel is widely used in the construction of nuclear pressure vessels, where its fracture toughness plays a critical role in ensuring operational safety and reliability. Understanding the ductile failure mechanisms in this steel grade is crucial for designing robust pressure vessels that can withstand severe nuclear environment conditions. In the present study, round bar specimens of SA508 Grade 3 steel with four distinct notch geometries were subjected to tensile loading while continuous 2D images were captured at 5-second intervals in order to monitor any alterations in their geometries and construct true stress-strain curves of the specimens. 3D reconstructions of high-resolution X-ray computed tomography (CT) images (0.82 μm spatial resolution) allowed for a comprehensive assessment of the influences of second-phase particles (i.e., manganese sulfide inclusions and cementite particles) on ductile failure initiation as a function of applied plastic strain. Additionally, based on 2D and 3D images, plasticity modeling was executed, and the results were compared to experimental data. A specific 'two-parameter criterion' was established and calibrated based on the correlation between stress triaxiality and equivalent plastic strain at failure initiation. The proposed criterion demonstrated substantial agreement with the experimental results, thus enhancing our knowledge of ductile fracture behavior in this steel grade. The implementation of X-ray CT and fractography analysis provided new insights into the diverse roles played by different populations of second-phase particles in fracture initiation under varying stress triaxiality conditions.
Keywords: ductile fracture, two-parameter criterion, x-ray computed tomography, stress triaxiality
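The abstract does not spell the criterion out; a common functional form for a two-parameter locus relating failure strain to triaxiality, used here only to make the idea concrete (the constants are placeholders, not the paper's calibrated values), is:

```latex
\[
  \bar{\varepsilon}_f \;=\; \alpha \,\exp\!\left(-\beta\,\eta\right),
  \qquad \eta \;=\; \frac{\sigma_m}{\sigma_{eq}},
\]
% \bar{\varepsilon}_f : equivalent plastic strain at failure initiation
% \eta                : stress triaxiality (mean stress over von Mises stress)
% (\alpha, \beta)     : the two parameters calibrated from the notched tests
```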
Procedia PDF Downloads 92
474 Better Defined WHO International Classification of Disease Codes for Relapsing Fever Borreliosis, and Lyme Disease Education Aiding Diagnosis, Treatment Improving Human Right to Health
Authors: Mualla McManus, Jenna Luche Thaye
Abstract:
World Health Organisation International Classification of Disease (ICD) codes were created to define diseases, including infections, in order to guide and educate diagnosticians. Most infectious diseases, such as syphilis, are clearly defined by their ICD-10 codes, which help educate clinicians in diagnosis and treatment globally. However, the current ICD-10 codes for relapsing fever borreliosis and Lyme disease are less clearly defined and can impede appropriate diagnosis, especially if the clinician is not familiar with the symptoms of these infectious diseases. This is despite the substantial number of scientific articles about relapsing fever and Lyme disease published in peer-reviewed journals. In the USA, an estimated 380,000 people contract Lyme disease annually, more cases than breast cancer and six times the number of HIV/AIDS cases. This represents an estimated 0.09% of the USA population. If extrapolated to the global population (7 billion), 0.09% equates to roughly 6.3 million people contracting relapsing fever or Lyme disease. In many regions, the rate of contracting some form of infection from a tick bite may be even higher. Without accurate and appropriate diagnostic codes, physicians are impeded in their ability to properly care for their patients, leaving those patients invisible and marginalized within the medical system and to those guiding public policy. This results in great personal hardship, pain, disability, and expense. It unnecessarily burdens health care systems, governments, families, and society as a whole. With accurate diagnostic codes in place, robust data can guide medical and public health research and health policy, track mortality, and save health care dollars. Better defined ICD codes are the way forward in educating diagnosticians about relapsing fever and Lyme disease.
Keywords: WHO ICD codes, relapsing fever, Lyme diseases, World Health Organisation
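The extrapolation is a straightforward proportion (note the original listing gave 63 million, which overstates the result of its own percentage by a factor of ten):

```latex
\[
  0.09\% \times 7\times 10^{9}
  \;=\; 9\times 10^{-4} \times 7\times 10^{9}
  \;=\; 6.3\times 10^{6}\ \text{people per year}.
\]
```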
Procedia PDF Downloads 193
473 Empirical Analysis of Forensic Accounting Practices for Tackling Persistent Fraud and Financial Irregularities in the Nigerian Public Sector
Authors: Sani AbdulRahman Bala
Abstract:
This empirical study delves into the realm of forensic accounting practices within the Nigerian public sector, seeking to quantitatively analyze their efficacy in addressing the persistent challenges of fraud and financial irregularities. With a focus on empirical data, this research employs a robust methodology to assess the current state of fraud in the Nigerian public sector and evaluate the performance of existing forensic accounting measures. Through quantitative analyses, including statistical models and data-driven insights, the study aims to identify patterns, trends, and correlations associated with fraudulent activities. The research objectives include scrutinizing documented fraud cases, examining the effectiveness of established forensic accounting practices, and proposing data-driven strategies for enhancing fraud detection and prevention. Leveraging quantitative methodologies, the study seeks to measure the impact of technological advancements on forensic accounting accuracy and efficiency. Additionally, the research explores collaborative mechanisms among government agencies, regulatory bodies, and the private sector by quantifying the effects of information sharing on fraud prevention. The empirical findings from this study are expected to provide a nuanced understanding of the challenges and opportunities in combating fraud within the Nigerian public sector. The quantitative insights derived from real-world data will contribute to the refinement of forensic accounting strategies, ensuring their effectiveness in addressing the unique complexities of financial irregularities in the public sector. The study's outcomes aim to inform policymakers, practitioners, and stakeholders, fostering evidence-based decision-making and proactive measures for a more resilient and fraud-resistant financial governance system in Nigeria.
Keywords: fraud, financial irregularities, Nigerian public sector, quantitative investigation
Procedia PDF Downloads 62
472 The Advantages of Using DNA-Barcoding for Determining the Fraud in Seafood
Authors: Elif Tugce Aksun Tumerkan
Abstract:
Although seafood is an important part of the human diet and among the most highly traded food commodities internationally, it remains generally overlooked in the global food security context. Food product authentication is of interest both to avoid commercial fraud and to manage risks that might be harmful to human health and safety. In recent years, with increasing consumer demand for transparency regarding food content, several instrumental analyses for detecting food fraud have emerged, based on analytical methodologies such as proteomics and metabolomics. While fish and seafood were previously consumed fresh, with advances in technology the consumption of processed or packaged seafood has increased. After seafood is processed or packaged, morphological identification becomes impossible once some of the external features have been removed. The main fish and seafood quality-related issues are the authentication of seafood contents, such as mislabelled products, which may be contaminated or replaced, partly or completely, by lower-quality or cheaper ones. For all these reasons, truthful, consistent, and easily applicable analytical methods are needed to assure correct labelling and to verify seafood products. DNA-barcoding methods have become popular, robust tools used in taxonomic research on endangered or cryptic species in recent years, and they are also used for determining food traceability. In this review, in comparison with other proteomic and metabolomic analyses, DNA-based methods are shown to allow the identification of all types of food, whether raw, spiced, or processed. This advantage arises because DNA is a comparatively more stable molecule than proteins and other molecules. Furthermore, its sequence variation between species and its presence in all organisms make DNA-based analysis preferable. This review was performed to clarify the main advantages of using DNA-barcoding for determining seafood fraud among other techniques.
Keywords: DNA-barcoding, genetic analysis, food fraud, mislabelling, packaged seafood
Procedia PDF Downloads 168
471 Optimal Tamping for Railway Tracks, Reducing Railway Maintenance Expenditures by the Use of Integer Programming
Authors: Rui Li, Min Wen, Kim Bang Salling
Abstract:
For modern railways, maintenance is critical for ensuring safety, train punctuality and overall capacity utilization. The cost of railway maintenance in Europe is high, on average between 30,000 and 100,000 Euros per kilometer per year. In order to reduce such maintenance expenditures, this paper presents a mixed 0-1 linear mathematical model designed to optimize predictive railway tamping activities for ballast track over a planning horizon of three to four years. The objective function is to minimize the actual costs of the tamping machine. The approach of the research is to use a simple dynamic model of the condition-based tamping process and a solution method for finding the optimal condition-based tamping schedule. Seven technical and practical aspects are taken into account to schedule tamping: (1) track degradation of the standard deviation of the longitudinal level over time; (2) track geometrical alignment; (3) track quality thresholds based on the train speed limits; (4) the dependency of the track quality recovery on the track quality after the tamping operation; (5) tamping machine operation practices; (6) tamping budgets; and (7) differentiating the open track from the station sections. The proposed maintenance model is applied to a 42.6 km Danish railway track between Odense and Fredericia for time periods of three and four years. The generated tamping schedule is reasonable and robust. Based on the results from the Danish railway corridor, the total costs can be reduced significantly (by 50%) compared with the previous model, which is based on optimizing the number of tamping operations. The different maintenance strategies are discussed in the paper. The analysis of the results obtained from the model also shows that a longer period of predictive tamping planning gives more optimal scheduling of maintenance actions than continuous short-term preventive maintenance, namely yearly condition-based planning.
Keywords: integer programming, railway tamping, predictive maintenance model, preventive condition-based maintenance
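A heavily simplified sketch of such a 0-1 tamping model in PuLP is shown below; the degradation rates, recovery amount, quality threshold, and three-segment corridor are toy numbers, and real recovery depends on the pre-tamping quality (aspect 4), which is linearized away here.

```python
import pulp

# Binary x[s][t] = 1 if segment s is tamped in year t (toy instance)
segments, years = range(3), range(4)
sigma0 = [1.1, 1.4, 1.6]     # initial std dev of longitudinal level (mm)
rate = [0.15, 0.20, 0.25]    # yearly degradation (mm/year)
recovery, limit, cost = 0.6, 2.0, 1.0

prob = pulp.LpProblem("tamping", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (segments, years), cat="Binary")
prob += pulp.lpSum(cost * x[s][t] for s in segments for t in years)

for s in segments:
    for t in years:
        # Quality if never tamped, minus a fixed recovery per prior tamping,
        # must stay below the speed-dependent threshold
        prob += (sigma0[s] + rate[s] * (t + 1)
                 - recovery * pulp.lpSum(x[s][k] for k in range(t + 1))
                 <= limit)

prob.solve(pulp.PULP_CBC_CMD(msg=0))
for s in segments:
    print([int(x[s][t].value()) for t in years])  # one row per segment
```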
Procedia PDF Downloads 443
470 Passively Q-Switched 914 nm Microchip Laser for LIDAR Systems
Authors: Marco Naegele, Klaus Stoppel, Thomas Dekorsy
Abstract:
Passively Q-switched microchip lasers offer great potential for sophisticated LiDAR systems due to their compact overall system design, excellent beam quality, and scalable pulse energies. However, many near-infrared solid-state lasers emit at wavelengths > 1000 nm, which are not compatible with state-of-the-art silicon detectors. Here we demonstrate a passively Q-switched microchip laser operating at 914 nm. The microchip laser consists of a 3 mm long Nd:YVO₄ crystal as the gain medium, while Cr⁴⁺:YAG with an initial transmission of 98% is used as the saturable absorber. Quasi-continuous pumping enables single-pulse operation, and low duty cycles ensure low overall heat generation and power consumption. Thus, thermally induced instabilities are minimized, and operation without active cooling is possible, while ambient temperature changes are compensated by adjustment of the pump laser current only. Single-emitter diode pumping at 808 nm leads to a compact overall system design and a robust setup. Utilization of a microchip cavity approach ensures single-longitudinal-mode operation with spectral bandwidths in the picometer regime and results in short laser pulses with pulse durations below 10 ns. Beam quality measurements reveal an almost diffraction-limited beam and enable conclusions concerning the thermal lens, which is essential to stabilize the plane-plane resonator. A 7% output coupler transmissivity is used to generate pulses with energies in the microjoule regime and peak powers of more than 600 W. Long-term measurements of pulse duration, pulse energy, central wavelength, and spectral bandwidth emphasize the excellent system stability and facilitate the utilization of this laser in the context of a LiDAR system.
Keywords: diode-pumping, LiDAR system, microchip laser, Nd:YVO4 laser, passively Q-switched
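The quoted peak power is consistent with the reported pulse parameters; for a pulse energy of 6 μJ (an illustrative value within the stated microjoule regime) and a 10 ns pulse:

```latex
\[
  P_{\text{peak}} \;\approx\; \frac{E_{\text{pulse}}}{\tau}
  \;=\; \frac{6\ \mu\text{J}}{10\ \text{ns}} \;=\; 600\ \text{W}.
\]
```

Shorter pulses or higher energies push the peak power above this figure, matching the "more than 600 W" stated above.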
Procedia PDF Downloads 129
469 The Impact of Oxytetracycline on the Aquaponic System, Biofilter, and Plants
Authors: Hassan Alhoujeiri, Angele Matrat, Sandra Beaufort, Claire joaniss Cassan, Jerome Silvester
Abstract:
Aquaponics is a sustainable food production technology, and its transition to industrial-scale systems has created several challenges that require further investigation in order to make it a robust process. One of the critical concerns is the potential accumulation of compounds from veterinary treatments, phytosanitary agents, fish feed, or simply from contaminated water sources. The accumulation of these substances could negatively impact fish health, microbial biofilters, and plant growth, thereby disrupting the system's overall balance and functionality. The lack of legislation and knowledge regarding the presence of such compounds in aquaponic systems raises concerns about their potential impact on both system balance and food safety. In this study, we focused on the effects of oxytetracycline (OTC), an antibiotic commonly used in aquaculture, on both the microbial biofilter and plant growth. Although OTC is rarely applied in aquaponics today, the fish compartment may need to be isolated from the system during treatment, as OTC inhibits specific bacterial populations, which could affect the microbial biofilter's efficiency. However, questions remain about the aquaponic system's tolerance threshold, particularly in cases of treatment or residual OTC traces post-treatment. The results of this study indicated a decline in microbial biofilter activity to 20% of the control, potentially corresponding to treatments of 41 mg/L of OTC. Analysis of microbial populations in the biofilter, using flow cytometry and microscopy (confocal and scanning electron microscopy), revealed an increase in bacterial mortality without disruption of the microbial biofilm. Additionally, OTC exposure led to noticeable changes in plant morphology (e.g., color) and growth, though it did not fully inhibit development. However, no significant effects were observed on seed germination at the tested concentrations, despite a measurable impact on subsequent plant growth.
Keywords: aquaponic, oxytetracycline, nitrifying biofilter, plant, micropollutants, sustainability
Procedia PDF Downloads 18
468 The Food Security and Nutritional Diversity Impacts of Coupling Rural Infrastructure and Value Chain Development: Evidence from a Generalized Propensity Score Analysis
Authors: Latif Apaassongo Ibrahim, Owusu-Addo Ebenezer, Isaac Bonuedo
Abstract:
Structural barriers, including inadequate infrastructure, poor market linkages, and limited access to financial and extension services, have been the major constraints to improved welfare in the semi-arid regions of Ghana, where food insecurity and malnutrition are persistent. The effects of infrastructural improvements as countermeasures are often confounded by other economic, social, and environmental variables. This study applies Directed Acyclic Graphs (DAGs) to map the causal pathways between infrastructure development and household welfare, identifying key mediators and confounders for one such initiative in Ghana. Then, using the Generalized Propensity Score (GPS) and Doubly Robust Estimation (IPWRA), this study evaluates the differential roles of government-supported improvements in the access to, and intensity of, commercial relative to public infrastructure on household food security and women's nutritional diversity, given three major value-chain improvements. The main findings suggest that these infrastructure improvements positively impact food security and nutrition, with women's empowerment and nutritional education acting as key mediators. Market access emerged as a stronger causal mechanism than productivity gains in linking infrastructure to improved welfare. Membership in Farmer-Based Organizations (FBOs) and participation in agribusiness linkages further amplified these impacts. However, the effects of infrastructure improvements were less clear when combined with the adoption of climate resilience practices, suggesting potential trade-offs.
Keywords: food security, nutrition, infrastructure, market access, women's empowerment, farmer-based organizations, climate resilience, Ghana
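As a sketch of the doubly robust logic behind IPWRA (a treatment model plus outcome regressions, combined so the effect estimate is consistent if either model is correctly specified), on simulated data with invented names and a known effect of 1.5:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 1000
x = rng.normal(size=(n, 2))                        # household covariates (toy)
p_true = 1 / (1 + np.exp(-(x @ np.array([0.8, -0.5]))))
d = (rng.random(n) < p_true).astype(float)         # infrastructure access (0/1)
y = 2.0 + 1.5 * d + x @ np.array([1.0, 0.3]) + rng.normal(size=n)

X = sm.add_constant(x)
ps = sm.Logit(d, X).fit(disp=0).predict(X)         # treatment (access) model
mu1 = sm.OLS(y[d == 1], X[d == 1]).fit().predict(X)   # outcome model, treated
mu0 = sm.OLS(y[d == 0], X[d == 0]).fit().predict(X)   # outcome model, control

# Augmented IPW average treatment effect (doubly robust)
ate = np.mean(d * (y - mu1) / ps + mu1) \
    - np.mean((1 - d) * (y - mu0) / (1 - ps) + mu0)
print(f"doubly robust ATE estimate: {ate:.2f}")    # close to the true 1.5
```

The GPS extension handles continuous treatment intensity with the same reweight-plus-regress idea.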
Procedia PDF Downloads 8
467 Climate Change and Urban Flooding: The Need to Rethink Urban Flood Management through Resilience
Authors: Suresh Hettiarachchi, Conrad Wasko, Ashish Sharma
Abstract:
The ever-changing and expanding urban landscape increases the stress on urban systems to support and maintain safe and functional living spaces. Flooding presents one of the more serious threats to this safety, putting a larger number of people in harm's way in congested urban settings. Climate change is adding to this stress by creating a dichotomy in the urban flood response. On the one hand, climate change is causing storms to intensify, resulting in more destructive, rarer floods, while on the other hand, longer dry periods are decreasing the severity of more frequent, less intense floods. This variability is creating a need to be more agile and innovative in how we design for and manage urban flooding. Here, we argue that to cope with the challenge climate change brings, we need to move towards urban flood management through resilience rather than flood prevention. We also argue that dealing with the larger variation in flood response to climate change means that we need to look at flooding from all aspects rather than through the single-dimensional focus on flood depths and extents. In essence, we need to rethink how we manage flooding in the urban space. This change in our thought process and approach to flood management requires a practical way to assess and quantify the resilience that is built into the urban landscape so that informed decision-making can support the required changes in planning and infrastructure design. Towards that end, we propose a Simple Urban Flood Resilience Index (SUFRI) based on a robust definition of resilience as a tool to assess flood resilience. The application of a simple resilience index such as the SUFRI can provide a practical tool that considers urban flood management in a multi-dimensional way and can present solutions that were not previously considered. When such an index is grounded on a clear and relevant definition of resilience, it can be a reliable and defensible way to assess and assist the process of adapting to the increasing challenges in urban flood management under climate change.
Keywords: urban flood resilience, climate change, flood management, flood modelling
Procedia PDF Downloads 49466 Genetic Instabilities in Marine Bivalve Following Benzo(α)pyrene Exposure: Utilization of Combined Random Amplified Polymorphic DNA and Comet Assay
Authors: Mengjie Qu, Yi Wang, Jiawei Ding, Siyu Chen, Yanan Di
Abstract:
Marine ecosystems are facing intensified, multiple stresses caused by environmental contaminants from human activities. Xenobiotics such as benzo(α)pyrene (BaP) have been discharged into the marine environment and cause hazardous impacts on both marine organisms and human beings. As filter feeders, marine mussels (Mytilus spp.) have been extensively used to monitor the marine environment; however, the genomic alterations induced in them by such xenobiotics remain largely unknown. In the present study, gills, the first defense barrier in mussels, were selected to evaluate the genetic instability induced by BaP exposure both in vivo and in vitro. The random amplified polymorphic DNA (RAPD) assay and the comet assay were applied as rapid, low-cost tools for assessing environmental stress. All mussels were confirmed to be a single species, Mytilus coruscus, before exposure to BaP at a concentration of 56 μg/L for 1 and 3 days (in vivo) or 1 and 3 hours (in vitro). Both RAPD and comet assay results showed significantly increased genomic instability with time-specific patterns of alteration. After the recovery period following in vivo exposure, genomic status returned to that of the control condition. However, relatively higher genomic instability was still observed in gill cells after recovery from in vitro exposure. Different repair mechanisms or signaling pathways may be involved in isolated gill cells compared with intact tissues. The study provides robust and rapid techniques for examining genomic stability in marine organisms in response to marine environmental changes and provides baseline information for further mechanistic research on stress responses in marine organisms.
Keywords: genotoxic impacts, in vivo/vitro exposure, marine mussels, RAPD and comet assay
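RAPD-based genotoxicity studies commonly summarize band changes as Genomic Template Stability, GTS(%) = (1 − a/n) × 100, where a is the mean number of band changes (new plus missing bands) per exposed profile and n is the number of bands in the control profile. The sketch below computes this metric on made-up band data; the abstract does not state that this exact metric was used, so treat it as an assumption about the standard workflow.

```python
# Hypothetical sketch of the Genomic Template Stability (GTS) metric often
# reported alongside RAPD assays; band sizes (bp) are illustrative.
def gts_percent(control_bands: set, exposed_profiles: list) -> float:
    n = len(control_bands)
    # Symmetric difference counts both new and missing bands per profile.
    changes = [len(control_bands ^ bands) for bands in exposed_profiles]
    a = sum(changes) / len(changes)          # mean band changes per sample
    return (1 - a / n) * 100

control = {200, 350, 500, 700, 900, 1200}    # control RAPD band sizes
exposed = [{200, 350, 500, 900, 1100},       # lost 700 & 1200, gained 1100
           {200, 350, 400, 500, 700, 900, 1200}]  # gained 400
print(f"GTS = {gts_percent(control, exposed):.1f}%")  # 66.7% here
```

Lower GTS indicates more RAPD profile change and hence greater genomic instability relative to the control.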
Procedia PDF Downloads 279465 Dynamic Modeling of Advanced Wastewater Treatment Plants Using BioWin
Authors: Komal Rathore, Aydin Sunol, Gita Iranipour, Luke Mulford
Abstract:
Advanced wastewater treatment plants have complex biological kinetics, time-variant influent flow rates, and long processing times. These factors complicate the modeling and operational control of such plants. However, developing a robust model for advanced wastewater treatment plants has become necessary in order to increase plant efficiency, reduce energy costs, and meet government discharge limits. Dynamic models were built in BioWin, the platform software from EnviroSim (Canada), for several wastewater treatment plants in Hillsborough County, Florida. Control strategies for parameters such as mixed liquor suspended solids, recycled activated sludge, and waste activated sludge were developed so that the models matched plant performance. The models were tuned using influent and effluent data from the plants and their laboratories, and data from the plant SCADA systems were used to predict influent flow rates and concentration profiles as functions of time. The kinetic parameters were tuned through sensitivity analysis and trial-and-error methods. The dynamic models were validated against experimental data for influent and effluent parameters, and dissolved oxygen measurements were used for further validation by coupling the models with Computational Fluid Dynamics (CFD) models. The BioWin models closely reproduced plant performance and predicted effluent behavior over extended periods. The models are useful for plant engineers and operators, who can make decisions in advance by predicting plant performance with them. One important finding was the effect of recycle and wastage ratios on mixed liquor suspended solids; the model was also useful in identifying the significant kinetic parameters for biological wastewater treatment systems.
Keywords: BioWin, kinetic modeling, flowsheet simulation, dynamic modeling
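BioWin itself is commercial, GUI-driven software, so no BioWin API is shown here; however, the finding about recycle and wastage ratios can be illustrated with textbook steady-state activated-sludge relations: the wastage rate sets the solids retention time (SRT), which in turn sets the mixed liquor suspended solids. The back-of-the-envelope sketch below uses illustrative parameter values under these simplifying assumptions, not the Hillsborough County plants' data or BioWin's full kinetic model.

```python
# Back-of-the-envelope sketch (not BioWin) of how the wastage rate sets
# SRT and hence MLSS in a complete-mix activated sludge reactor, using
# textbook steady-state relations and illustrative parameter values.
Q = 10_000.0          # influent flow, m3/d
V = 2_500.0           # aeration tank volume, m3
S0, S = 250.0, 5.0    # influent / effluent substrate, mg BOD/L
Y, kd = 0.5, 0.06     # yield (mg VSS/mg BOD) and decay rate (1/d)

HRT = V / Q                              # hydraulic retention time, d
for Qw_frac in (0.005, 0.01, 0.02):      # wastage as a fraction of Q
    SRT = V / (Qw_frac * Q)              # wasting from the aeration tank
    X = (SRT / HRT) * Y * (S0 - S) / (1 + kd * SRT)   # biomass proxy, mg/L
    print(f"wastage {Qw_frac:.1%}: SRT = {SRT:5.1f} d, MLSS ~ {X:6.0f} mg/L")
```

Doubling the wastage fraction halves the SRT and markedly lowers the steady-state solids concentration, which is the qualitative behavior the calibrated plant models exposed.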
Procedia PDF Downloads 154464 The Determinants of Enterprise Risk Management: Literature Review and Future Research
Authors: Sylvester S. Horvey, Jones Mensah
Abstract:
The growing complexities and dynamics of the business environment have led to a new approach to risk management, known as enterprise risk management (ERM). ERM is a system and approach for managing an organization's risks in an integrated manner to achieve corporate goals and strategic objectives. Regardless of the diversities in the business environment, ERM has become an essential factor in managing individual and business risks, because ERM is believed to enhance shareholder value and firm growth. Despite the growing body of literature on ERM, the question of what factors drive ERM remains underexplored. This study provides a comprehensive literature review of the main factors that contribute to ERM implementation. Google Scholar was the leading search engine used to identify empirical literature, the review spanned the years 2000 to 2020, and articles published in Scimago-ranked and Scopus-indexed journals were examined. Thirteen firm characteristics and sixteen articles were considered for the empirical review. Most empirical studies agreed that firm size, institutional ownership, industry type, auditor type, industrial diversification, earnings volatility, stock price volatility, and the presence of an internal auditor had a positive relationship with ERM adoption, with firm size, institutional ownership, auditor type, and type of industry most often found to be statistically significant. Other factors, such as financial leverage, profitability, asset opacity, international diversification, and firm complexity, yielded inconclusive results. The growing literature on ERM is not without limitations; hence, this study suggests that further research should examine ERM determinants in new geographical contexts while considering a new and more robust way of measuring ERM rather than relying on a simple proxy (dummy) variable. Other firm characteristics, such as organizational culture and context, corporate scandals and losses, and governance, could also be considered as determinants of ERM adoption.
Keywords: enterprise risk management, determinants, ERM adoption, literature review
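The empirical design the review describes, regressing an ERM-adoption dummy on firm characteristics, is typically estimated as a logit model. The sketch below shows that general form on synthetic data; the covariates, coefficients, and data are hypothetical and do not correspond to any reviewed study.

```python
# Illustrative logit of an ERM-adoption dummy on firm characteristics,
# fitted on synthetic data; not a reproduction of any reviewed study.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 500
size     = rng.normal(10, 2, n)        # log total assets
inst_own = rng.uniform(0, 1, n)        # institutional ownership share
big4     = rng.binomial(1, 0.6, n)     # Big Four auditor dummy
leverage = rng.uniform(0, 0.8, n)      # financial leverage

# Synthetic data-generating process for the adoption decision.
logit_p = -6 + 0.5 * size + 1.2 * inst_own + 0.8 * big4 + 0.0 * leverage
erm = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))   # ERM adoption (0/1)

X = sm.add_constant(np.column_stack([size, inst_own, big4, leverage]))
model = sm.Logit(erm, X).fit(disp=False)
print(model.summary(xname=["const", "size", "inst_own", "big4", "leverage"]))
```

The dummy outcome is exactly the "simple proxy" the authors criticize: it records whether a firm adopted ERM at all, but carries no information about the maturity or quality of the ERM program.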
Procedia PDF Downloads 173