Search results for: dual phase lag model
8257 Infestation in Omani Date Palm Orchards by Dubas Bug Is Related to Tree Density
Authors: Lalit Kumar, Rashid Al Shidi
Abstract:
Phoenix dactylifera (date palm) is a major crop in many Middle Eastern countries, including Oman. The Dubas bug Ommatissus lybicus is the main pest that affects date palm crops. However, not all plantations are infested, and it is still uncertain why some plantations become infested while others do not. This research investigated whether tree density and the system of planting (random versus systematic) had any relationship with the occurrence and level of infestation. Remote sensing and Geographic Information Systems were used to determine the density of trees (number of trees per unit area), while infestation levels were determined by manually counting insects on 40 leaflets from two fronds on each tree, with a total of 20-60 trees in each village. Infestation was recorded as the average number of insects per leaflet. For tree density estimation, WorldView-3 scenes, with eight bands and 2 m spatial resolution, were used. The local maxima method, which locates the pixel of highest brightness within a given search window, was used to identify and delineate individual trees in the image. This information was then used to determine whether the plantation was random or systematic. Ordinary least squares (OLS) regression was used to test the global correlation between tree density and infestation level, and geographically weighted regression (GWR) was used to find the local spatial relationship. The accuracy of tree detection varied from 83–99% in agricultural lands with systematic planting patterns to 50–70% in natural forest areas. Results revealed that the tree density in most of the villages was higher than the recommended planting density (120–125 trees/hectare). For infestation correlations, the GWR model showed a strong, significant positive relationship between infestation and tree density in the spring season, with R² = 0.60, and a moderate, significant positive relationship in the autumn season, with R² = 0.30.
In contrast, the OLS model showed a weak but significant positive relationship in the spring season (R² = 0.02, p < 0.05) and an insignificant relationship in the autumn season (R² = 0.01, p > 0.05). The results showed a positive correlation between infestation and tree density, suggesting that infestation severity increased as the density of date palm trees increased. The correlation results indicated that density alone accounted for about 60% of the increase in infestation. This information can be used by the relevant authorities to better control infestations as well as to manage their pesticide spraying programs.
Keywords: Dubas bug, date palm, tree density, infestation levels
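The global OLS fit reported above can be sketched in a few lines of plain Python; the density and infestation values below are invented for illustration and are not the study's data (the GWR step would additionally require spatially varying weights):

```python
def ols_fit(x, y):
    """Simple linear regression y = a + b*x; returns (a, b, r_squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1.0 - ss_res / ss_tot

# Hypothetical village-level data: trees/ha vs. mean insects per leaflet
density = [110, 125, 140, 160, 180, 200]
infestation = [0.8, 1.1, 1.5, 1.9, 2.6, 3.0]
a, b, r2 = ols_fit(density, infestation)
```

A positive slope `b` with a high R² would indicate the kind of density-infestation association the abstract reports.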
Procedia PDF Downloads 193
8256 Analysis of Lift Force in Hydrodynamic Transport of a Finite-Sized Particle in Inertial Microfluidics with a Rectangular Microchannel
Authors: Xinghui Wu, Chun Yang
Abstract:
Inertial microfluidics is a competitive fluidic method with applications in the separation of particles, cells, and bacteria. In contrast to traditional microfluidic devices operating at low Reynolds number, inertial microfluidics works in the intermediate Re range, which brings about several intriguing inertial effects on particle separation/focusing to meet real-world throughput requirements. Geometric modifications that make channels irregularly shaped can leverage fluid inertia to create complex secondary flows that adjust particle equilibrium positions and thus enhance separation resolution and throughput. Although inertial microfluidics has been extensively studied experimentally, our current understanding of its mechanisms is poor, making it extremely difficult to establish rational design guidelines for particle focusing locations, especially in irregularly shaped microfluidic channels. Inertial particle microfluidics in irregularly shaped channels has been investigated in our group. Several fundamental issues remain to be addressed. One of them concerns the balance between the inertial lift forces and the secondary-flow drag forces. It is also critical to quantitatively describe the dependence of the lift forces on particle–particle interactions in irregularly shaped channels, such as a rectangular one. To provide physical insights into inertial microfluidics in channels of irregular shapes, in this work the immersed boundary–lattice Boltzmann method (IB-LBM) was introduced and validated to explore the transport characteristics and underlying mechanisms of an inertially focusing single particle in a rectangular microchannel. The transport dynamics of a finite-sized particle were investigated over wide ranges of Reynolds number (20 < Re < 500) and particle size.
The results show that the inner equilibrium positions are less likely to occur in the rectangular channel, which can be explained by the secondary flow caused by the presence of a finite-sized particle. Furthermore, force-decoupling analysis was utilized to study the effect of each type of lift force on the inertial migration, and a theoretical model for the lateral lift force on a finite-sized particle in the rectangular channel was established. This theoretical model can be used to provide guidance for the design and operation of inertial microfluidic devices.
Keywords: inertial microfluidics, particle focusing, lift force, IB-LBM
Procedia PDF Downloads 72
8255 PID Control of Quad-Rotor Unmanned Vehicle Based on Lagrange Approach Modelling
Authors: A. Benbouali, H. Saidi, A. Derrouazin, T. Bessaad
Abstract:
Aerial robotics is a very exciting research field dealing with a variety of subjects, including attitude control. This paper deals with the control of a four-rotor vertical take-off and landing (VTOL) Unmanned Aerial Vehicle. It presents a mathematical model based on the Lagrange approach for the flight control of an autonomous quad-rotor and describes the controller architecture, which is based on PID regulators. The control method has been simulated in closed loop in different situations. All the calculation stages and the simulation results are detailed.
Keywords: quad-rotor, Lagrange approach, proportional integral derivative (PID) controller, Matlab/Simulink
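As a rough illustration of the PID regulators described above, the sketch below closes a loop around a toy first-order plant. The gains and the plant are illustrative only and are not taken from the paper's quad-rotor model:

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*sum(e)*dt + Kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy altitude loop around a pure-integrator plant z' = u
pid = PID(kp=2.0, ki=0.5, kd=0.8, dt=0.01)
z = 0.0
for _ in range(2000):           # simulate 20 s
    u = pid.update(1.0, z)      # track a 1 m setpoint
    z += u * 0.01               # integrate the plant one step
```

In a real quad-rotor, one such regulator would typically be wired per controlled axis (roll, pitch, yaw, altitude) around the Lagrange-derived dynamics.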
Procedia PDF Downloads 401
8254 Catalytic Activity Study of Fe, Ti Loaded TUD-1
Authors: Supakorn Tantisriyanurak, Hussaya Maneesuwan, Thanyalak Chaisuwan, Sujitra Wongkasemjit
Abstract:
TUD-1 is a siliceous mesoporous material with a three-dimensional amorphous structure of random, interconnecting pores, large pore size, high surface area (400–1000 m²/g), hydrothermal stability, and tunable porosity. However, a significant disadvantage of mesoporous silicates is their scarcity of catalytically active sites. In this work, a series of bimetallic Fe- and Ti-incorporated TUD-1 frameworks is successfully synthesized by the sol–gel method. The synthesized Fe,Ti-TUD-1 is characterized by various techniques. To study its catalytic activity, phenol hydroxylation was selected as a model reaction. The amounts of residual phenol and oxidation products were determined by high-performance liquid chromatography coupled with a UV detector (HPLC-UV).
Keywords: iron, phenol hydroxylation, titanium, TUD-1
Procedia PDF Downloads 259
8253 Laser Powder Bed Fusion Awareness for Engineering Students in France and Qatar
Authors: Hiba Naccache, Rima Hleiss
Abstract:
Additive manufacturing (AM), or 3D printing, is one of the pillars of Industry 4.0. Compared to traditional manufacturing, AM provides a prototype before production in order to optimize the design and avoid excess stock, and it uses only the strictly necessary material, which can be recyclable, favouring local production and saving money, time, and resources. Different types of AM exist, with a broad range of applications across several industries such as aerospace, automotive, medicine, and education. Laser Powder Bed Fusion (LPBF) is a metal AM technique that uses a laser to melt metal powder, layer by layer, to build a three-dimensional (3D) object. In Industry 4.0, and in line with Goals 9 (Industry, Innovation and Infrastructure) and 12 (Responsible Consumption and Production) of the UN 2030 Agenda for Sustainable Development, AM manufacturers are committed to minimizing the environmental impact of every production run. LPBF has several environmental advantages, such as reduced waste production, lower energy consumption, and greater flexibility in creating lightweight components with complex geometries. However, LPBF also has environmental drawbacks, including energy consumption, gas consumption, and emissions. It is critical to recognize the environmental impacts of LPBF in order to mitigate them. To increase awareness and promote sustainable practices regarding LPBF, the researchers use the Elaboration Likelihood Model (ELM) theory, in which people from multiple universities in France and Qatar process information in two ways: peripherally and centrally. Peripheral campaigns use superficial cues to get attention, while central campaigns provide clear and concise information. The authors created a seminar including a video showing LPBF production and a website with educational resources.
Data were collected using a questionnaire administered before and after the seminar to assess public awareness and attitudes. The results reflected a substantial shift in awareness of LPBF and its impact on the environment. To the best of our knowledge, no similar research exists, so this study will add to the literature on the sustainability of the LPBF production technique.
Keywords: additive manufacturing, laser powder bed fusion, elaboration likelihood model theory, sustainable development goals, education-awareness, France, Qatar, specific energy consumption, environmental impact, lightweight components
Procedia PDF Downloads 90
8252 Modelling and Control of Electrohydraulic System Using Fuzzy Logic Algorithm
Authors: Hajara Abdulkarim Aliyu, Abdulbasid Ismail Isa
Abstract:
This research paper studies an electrohydraulic system for its role in position and motion control and develops a mathematical model describing the behaviour of the system. It further proposes fuzzy logic and conventional PID controllers in order to achieve both accurate positioning of the payload and overall improvement of system performance. The simulation results show that the fuzzy logic controller has superior tracking performance and high disturbance-rejection efficiency, with shorter settling time, less overshoot, and smaller integral-of-absolute-error and deviation-error values than the conventional PID controller under all testing conditions.
Keywords: electrohydraulic, fuzzy logic, modelling, NZ-PID
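A minimal sketch of the fuzzy-logic idea: triangular membership functions over the position error and a weighted average of singleton rule outputs. The membership breakpoints and rule table below are invented for illustration and are not the paper's controller:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_controller(error):
    """Map a normalised position error in [-1, 1] to a valve command via
    three rules (negative/zero/positive) and singleton defuzzification."""
    neg = tri(error, -2.0, -1.0, 0.0)
    zero = tri(error, -1.0, 0.0, 1.0)
    pos = tri(error, 0.0, 1.0, 2.0)
    # Rule consequents as singletons: neg -> -1, zero -> 0, pos -> +1
    num = -1.0 * neg + 0.0 * zero + 1.0 * pos
    den = neg + zero + pos
    return num / den if den else 0.0
```

A practical controller would add a second input (error rate) and a full rule table, but the inference and defuzzification steps are the same.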
Procedia PDF Downloads 471
8251 Brain Age Prediction Based on Brain Magnetic Resonance Imaging by 3D Convolutional Neural Network
Authors: Leila Keshavarz Afshar, Hedieh Sajedi
Abstract:
Estimation of biological brain age from MR images is a topic that has received much attention in recent years due to its importance for the early diagnosis of diseases such as Alzheimer's. In this paper, we use a 3D Convolutional Neural Network (CNN) to provide a method for estimating the biological age of the brain. The 3D-CNN model is trained on normalized MRI data. In addition, to reduce computation while preserving overall performance, a subset of effective slices is selected for age estimation. With this method, the biological age of individuals was estimated from the selected normalized data with a Mean Absolute Error (MAE) of 4.82 years.
Keywords: brain age estimation, biological age, 3D-CNN, deep learning, T1-weighted image, SPM, preprocessing, MRI, Canny, gray matter
Procedia PDF Downloads 148
8250 Scheduled Maintenance and Downtime Cost in Aircraft Maintenance Management
Authors: Remzi Saltoglu, Nazmia Humaira, Gokhan Inalhan
Abstract:
During aircraft maintenance scheduling, the operator calculates the maintenance budget. Usually, this calculation includes only the costs that are directly related to the maintenance process, such as the cost of labor, material, and equipment. In some cases, overhead cost is also included. Downtime cost, however, is often neglected on the grounds that grounding is a natural fact of maintenance, and it is therefore not considered part of the analytical decision-making process. Based on normalized data, we introduce downtime cost with its monetary value and add its seasonal character. We envision that the rest of the model, which works together with the downtime cost, could be checked against real-life cases through a review of MRO costs and airline spending in particular scheduled maintenance events.
Keywords: aircraft maintenance, downtime, downtime cost, maintenance cost
Procedia PDF Downloads 356
8249 Simulation of Human Heart Activation Based on Diffusion Tensor Imaging
Authors: Ihab Elaff
Abstract:
Simulating the heart’s electrical stimulation is essential in modeling and evaluating the electrophysiological behavior of the heart. Two structures are of concern: the ventricular myocardium and the ventricular conduction network. The ventricular myocardium was modeled as an anisotropic material from a Diffusion Tensor Imaging (DTI) scan, and the conduction network was extracted from DTI as a case-based structure, based on the biological properties of heart tissue and the working methodology of the Magnetic Resonance Imaging (MRI) scanner. The resulting activation was very similar to real measurements of the reference model presented in the literature.
Keywords: diffusion tensor, DTI, heart, conduction network, excitation propagation
Procedia PDF Downloads 266
8248 Optimal Pricing Based on Real Estate Demand Data
Authors: Vanessa Kummer, Maik Meusel
Abstract:
Real estate demand estimates are typically derived from transaction data. However, in regions with excess demand, transactions are driven by supply and therefore do not indicate what people are actually looking for. To estimate the demand for housing in Switzerland, search subscriptions from all important Swiss real estate platforms are used. These data do, however, suffer from missing information; for example, many users do not specify how many rooms they would like or what price they would be willing to pay. In economic analyses, it is often the case that only complete records are used. Usually, however, the proportion of complete records is rather small, which leads to most of the information being neglected, and the complete records themselves may be strongly biased. In addition, the reason that data are missing might itself contain information, which is ignored with that approach. An interesting question is, therefore, whether for economic analyses such as the one at hand there is added value in using the whole data set with imputed missing values compared to using the usually small percentage of complete records (the baseline). It is also interesting to see how different algorithms affect that result. The imputation of the missing data is done using unsupervised learning. Of the numerous unsupervised learning approaches, the most common ones, such as clustering, principal component analysis, and neural network techniques, are applied. By training the model iteratively on the imputed data and thereby including the information of all records in the model, the distortion of the first training set (the complete records) vanishes. In a next step, the performance of the algorithms is measured. This is done by randomly creating missing values in subsets of the data, estimating those values with the relevant algorithms and several parameter combinations, and comparing the estimates to the actual data.
After the optimal parameter set for each algorithm has been found, the missing values are imputed. Using the resulting data sets, the next step is to estimate the willingness to pay for real estate. This is done by fitting price distributions for real estate properties with certain characteristics, such as the region or the number of rooms. Based on these distributions, survival functions are computed to obtain the functional relationship between characteristics and selling probabilities. Comparing the survival functions shows that estimates based on the imputed data sets do not differ significantly from each other; however, the demand estimate derived from the baseline data does. This indicates that the baseline data set does not include all available information and is therefore not representative of the entire sample. Demand estimates derived from the whole data set are also much more accurate than the baseline estimate. Thus, in order to obtain optimal results, it is important to make use of all available data, even though this involves additional procedures such as data imputation.
Keywords: demand estimate, missing-data imputation, real estate, unsupervised learning
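As a toy illustration of the imputation step, the sketch below fills missing prices with the mean of complete records sharing the same room count (a crude, hypothetical stand-in for the clustering-based imputation described above; the records are invented):

```python
def impute_group_mean(records, key, target):
    """Fill missing `target` values with the mean of complete records that
    share the same `key`; fall back to the overall mean if a group is empty."""
    groups = {}
    for r in records:
        if r[target] is not None:
            groups.setdefault(r[key], []).append(r[target])
    overall = [r[target] for r in records if r[target] is not None]
    for r in records:
        if r[target] is None:
            vals = groups.get(r[key])
            pool = vals if vals else overall
            r[target] = sum(pool) / len(pool)
    return records

# Hypothetical search subscriptions: rooms specified, desired price sometimes missing
subs = [
    {"rooms": 3, "price": 2000}, {"rooms": 3, "price": 2200},
    {"rooms": 3, "price": None}, {"rooms": 4, "price": 3000},
]
subs = impute_group_mean(subs, "rooms", "price")
```

The iterative scheme in the abstract would re-estimate group structure on the imputed data and repeat until the imputations stabilise.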
Procedia PDF Downloads 290
8247 A Numerical Study on Micromechanical Aspects in Short Fiber Composites
Authors: I. Ioannou, I. M. Gitman
Abstract:
This study focused on the contribution of micromechanical parameters to the macro-mechanical response of short fiber composites, namely a polypropylene matrix reinforced by glass fibers. In this paper, attention has been given to glass fiber length as a micromechanical parameter influencing the overall macroscopic material behavior. Three-dimensional numerical models were developed and analyzed through the concept of a Representative Volume Element (RVE). Results of the RVE-based approach were compared with the analytical Halpin-Tsai model.
Keywords: effective properties, homogenization, representative volume element, short fiber reinforced composites
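The analytical comparison mentioned above can be sketched with the Halpin-Tsai equations, which estimate an effective composite modulus from the fibre and matrix moduli, the fibre volume fraction, and a geometry factor that encodes the fibre aspect ratio. The material values below are illustrative, not the study's:

```python
def halpin_tsai(E_f, E_m, V_f, zeta):
    """Halpin-Tsai estimate of an effective composite modulus.
    zeta is the geometry factor, commonly 2*(l/d) for the longitudinal
    modulus of aligned short fibres and about 2 for the transverse one."""
    ratio = E_f / E_m
    eta = (ratio - 1.0) / (ratio + zeta)
    return E_m * (1.0 + zeta * eta * V_f) / (1.0 - eta * V_f)

# Illustrative glass/polypropylene values (GPa): E_f = 72, E_m = 1.5
aspect_ratio = 20                     # fibre length / diameter
E_long = halpin_tsai(72.0, 1.5, 0.2, 2 * aspect_ratio)
```

Because zeta grows with fibre length, the model captures the fibre-length sensitivity that the RVE study investigates numerically.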
Procedia PDF Downloads 269
8246 “I” on the Web: Social Penetration Theory Revised
Authors: Dionysis Panos (Dpt. of Communication and Internet Studies, Cyprus University of Technology)
Abstract:
The widespread use of New Media, and particularly Social Media, through fixed or mobile devices, has changed in a staggering way our perception of what is “intimate" and "safe" and what is not in interpersonal communication and social relationships. The distribution of self- and identity-related information in communication now evolves under new and different conditions and contexts. Consequently, this new framework forces us to rethink processes and mechanisms, such as what "exposure" means in interpersonal communication contexts, how the distinction between the "private" and the "public" nature of information is negotiated online, and how the "audiences" we interact with are understood and constructed. Drawing from an interdisciplinary perspective that combines sociology, communication psychology, media theory, and new media and social networks research, as well as from the empirical findings of a longitudinal comparative study, this work proposes an integrative model for comprehending mechanisms of personal information management in interpersonal communication, which can be applied to both online (Computer-Mediated) and offline (Face-To-Face) communication. The presentation is based on conclusions drawn from a longitudinal qualitative research study with 458 new media users from 24 countries spanning almost a decade. The main conclusions include: (1) There is a clear and evidenced shift in users’ perception of the degree of "security" and "familiarity" of the Web between the pre- and post-Web 2.0 eras. The role of Social Media in this shift was catalytic. (2) Basic Web 2.0 applications changed dramatically the nature of the Internet itself, transforming it from a place reserved for “elite users / technical knowledge keepers" into a place of "open sociability” for anyone. (3) Web 2.0 and Social Media brought about a significant change in the concept of the “audience” we address in interpersonal communication.
The previous "general and unknown audience" of personal home pages has been converted into an "individual and personal" audience chosen by the user under various criteria. (4) The way we negotiate the 'private' and 'public' nature of personal information has changed in a fundamental way. (5) The distinctive features of the mediated environment of online communication, and the critical changes that have occurred since the advent of Web 2.0, lead to the need to reconsider and update the theoretical models and analysis tools we use in our effort to comprehend the mechanisms of interpersonal communication and personal information management. A new model is therefore proposed here for understanding the way interpersonal communication evolves, based on a revision of social penetration theory.
Keywords: new media, interpersonal communication, social penetration theory, communication exposure, private information, public information
Procedia PDF Downloads 374
8245 Design and Evaluation on Sierpinski-Triangle Acoustic Diffusers Based on Fractal Theory
Authors: Lingge Tan, Hongpeng Xu, Jieun Yang, Maarten Hornikx
Abstract:
Acoustic diffusers are important components for enhancing the quality of room acoustics. This paper presents a type of modular diffuser based on the planar Sierpinski triangle and combines it with fractal theory to expand the effective frequency range. In numerical calculations and full-scale model experiments, the effect of fractal design elements on normal-incidence diffusion coefficients is examined. It is demonstrated that the reasonable number of module iterations is three and that the coverage density is 58.4% over the design frequency range of 125 Hz to 4 kHz.
Keywords: acoustic diffuser, fractal, Sierpinski triangle, diffusion coefficient
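For context, the normal-incidence diffusion coefficient evaluated above is commonly computed from the polar response of scattered sound levels via an autocorrelation formula (as standardised in ISO 17497-2). A sketch, with illustrative level values rather than the paper's measurements:

```python
def diffusion_coefficient(levels_db):
    """Directional diffusion coefficient from polar-response levels L_i in dB,
    using the autocorrelation definition: values near 1 mean uniform scattering."""
    p = [10 ** (L / 10.0) for L in levels_db]          # dB -> energy
    n = len(p)
    sum_sq = sum(x * x for x in p)
    return (sum(p) ** 2 - sum_sq) / ((n - 1) * sum_sq)

# A perfectly uniform polar response over 37 receiver angles -> coefficient of 1
uniform = diffusion_coefficient([60.0] * 37)
# A strongly specular response (one dominant lobe) -> coefficient near 0
specular = diffusion_coefficient([60.0] + [0.0] * 36)
```

The paper's fractal iterations aim to push this coefficient up across the 125 Hz to 4 kHz design band.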
Procedia PDF Downloads 155
8244 The Effectiveness of Blended Learning in Pre-Registration Nurse Education: A Mixed Methods Systematic Review and Meta-Analysis
Authors: Albert Amagyei, Julia Carroll, Amanda R. Amorim Adegboye, Laura Strumidlo, Rosie Kneafsey
Abstract:
Introduction: Classroom-based learning has persisted as the mainstream model of pre-registration nurse education. This model is often rigid, teacher-centered, and unable to support active learning and the practical learning needs of nursing students. Health Education England (HEE), a public body of the Department of Health and Social Care, hypothesises that blended learning (BL) programmes may address health system and nursing profession challenges, such as nursing shortages and lack of digital expertise, by exploring opportunities for providing predominantly online, remote-access study, which may increase nursing student recruitment by offering pathways to nursing other than the traditional classroom route. This study provides evidence for blended learning strategies adopted in nursing education and examines nursing students' learning experiences with respect to the challenges and opportunities of using blended learning within nursing education. Objective: This review explores the challenges and opportunities of BL within pre-registration nurse education from the student's perspective. Methods: The search was completed within five databases. Eligible studies were appraised independently by four reviewers. The JBI convergent segregated approach for mixed methods reviews was used to assess and synthesize the data. The study's protocol is registered with the International Prospective Register of Systematic Reviews (PROSPERO, registration number CRD42023423532). Results: Twenty-seven (27) studies (21 quantitative and 6 qualitative) were included in the review. The study confirmed that BL positively impacts nursing students' learning outcomes, as demonstrated by the findings of the meta-analysis and meta-synthesis. Conclusion: The review compared BL to traditional learning, simulation, laboratory, and online learning with respect to nursing students' learning and programme outcomes, as well as learning behaviour and experience.
The results show that BL can effectively improve nursing students' knowledge, academic achievement, critical skills, and clinical performance, as well as enhance learner satisfaction and programme retention. The review findings indicate that students' background characteristics and the BL design and format significantly impact the success of a BL nursing programme.
Keywords: nursing student, blended learning, pre-registration nurse education, online learning
Procedia PDF Downloads 54
8243 Population Pharmacokinetics of Levofloxacin and Moxifloxacin, and the Probability of Target Attainment in Ethiopian Patients with Multi-Drug Resistant Tuberculosis
Authors: Temesgen Sidamo, Prakruti S. Rao, Eleni Akllilu, Workineh Shibeshi, Yumi Park, Yong-Soon Cho, Jae-Gook Shin, Scott K. Heysell, Stellah G. Mpagama, Ephrem Engidawork
Abstract:
The fluoroquinolones (FQs) are used off-label for the treatment of multidrug-resistant tuberculosis (MDR-TB) and are under evaluation for shortening the duration of treatment of drug-susceptible TB in recently prioritized regimens. Within the class, levofloxacin (LFX) and moxifloxacin (MXF) play a substantial role in ensuring successful treatment outcomes. However, sub-therapeutic plasma concentrations of either LFX or MXF may drive unfavorable treatment outcomes. To the best of our knowledge, the pharmacokinetics of LFX and MXF in Ethiopian patients with MDR-TB have not yet been investigated. Therefore, the aim of this study was to develop a population pharmacokinetic (PopPK) model of LFX and MXF and assess the percent probability of target attainment (PTA), defined by the ratio of the area under the plasma concentration-time curve over 24 h (AUC0-24) to the in vitro minimum inhibitory concentration (MIC) (AUC0-24/MIC), in Ethiopian MDR-TB patients. Steady-state plasma was collected from 39 MDR-TB patients enrolled in the programmatic treatment course, and drug concentrations were determined using optimized liquid chromatography-tandem mass spectrometry. In addition, the in vitro MICs of the patients' pretreatment clinical isolates were determined. PopPK models and simulations were run at various doses, and PK parameters were estimated. The effect of covariates on the PK parameters and the PTA for maximum mycobacterial kill and resistance prevention was also investigated. LFX and MXF both fit a one-compartment model with adjustments. The apparent volume of distribution (V) and clearance (CL) of LFX were influenced by serum creatinine (Scr), whereas the absorption constant (Ka) and V of MXF were influenced by Scr and BMI, respectively.
The PTA for LFX maximal mycobacterial kill at the critical MIC of 0.5 mg/L was 29%, 62%, and 95% with the simulated 750 mg, 1000 mg, and 1500 mg doses, respectively, whereas the PTA for resistance prevention at 1500 mg was only 4.8%, with none of the lower doses achieving this target. At the critical MIC of 0.25 mg/L, there was no difference in the PTA (94.4%) for maximum bacterial kill among the simulated doses of MXF (600 mg, 800 mg, and 1000 mg), but the PTA for resistance prevention improved proportionately with dose. Standard LFX and MXF doses may not provide adequate drug exposure. LFX PopPK is more predictable for maximum mycobacterial kill, whereas MXF's resistance-prevention target attainment increases with dose. Scr and BMI are likely to be important covariates in dose optimization or therapeutic drug monitoring (TDM) studies in Ethiopian patients.
Keywords: population PK, PTA, moxifloxacin, levofloxacin, MDR-TB patients, Ethiopia
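The PTA computation described above can be sketched as a simple Monte Carlo over a lognormal AUC0-24 distribution. The geometric mean, variability, MIC, and target below are illustrative placeholders, not the study's estimates:

```python
import math
import random

def pta(auc_mean, auc_cv, mic, target, n=10_000, seed=42):
    """Probability of target attainment: fraction of simulated patients whose
    AUC0-24/MIC meets `target`. AUC0-24 is drawn from a lognormal with
    geometric mean `auc_mean` and coefficient of variation `auc_cv`."""
    sigma = math.sqrt(math.log(1.0 + auc_cv ** 2))
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n)
               if rng.lognormvariate(math.log(auc_mean), sigma) / mic >= target)
    return hits / n

# Illustrative placeholders: AUC0-24 geometric mean 100 mg*h/L, 30% CV,
# MIC 0.5 mg/L, target AUC0-24/MIC >= 100
p = pta(auc_mean=100.0, auc_cv=0.3, mic=0.5, target=100.0)
```

Re-running this across the observed MIC distribution and simulated dose levels is what produces dose-by-MIC PTA tables like those in the abstract.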
Procedia PDF Downloads 121
8242 Efficient Modeling Technique for Microstrip Discontinuities
Authors: Nassim Ourabia, Malika Ourabia
Abstract:
A new and efficient method is presented for the analysis of arbitrarily shaped discontinuities. The technique obtains closed-form expressions for the equivalent circuits used to model these discontinuities, making it easy to handle and characterize complicated structures such as T and Y junctions, truncated junctions, arbitrarily shaped junctions, cascaded junctions, and, more generally, planar multiport junctions. Another advantage of this method is that the edge line concept for arbitrarily shaped junctions operates with real-parameter circuits. The validity of the method was further confirmed by comparing our results for various discontinuities (bends, filters) with those from HFSS as well as from other published sources.
Keywords: CAD analysis, contour integral approach, microwave circuits, s-parameters
Procedia PDF Downloads 516
8241 Resilient Leadership: An Analysis for Challenges, Transformation and Improvement of Organizational Climate in Gastronomic Companies
Authors: Margarita Santi Becerra Santiago
Abstract:
The following document presents a descriptive analysis, under a qualitative approach, of resilient leadership, highlighting the importance of applying a new leadership model to face the new challenges within gastronomic companies in Mexico. It also identifies the main factors that lead resilient leaders and companies to develop new skills and elaborate strategies that contribute to overcoming adversity and managing change. Adversity in a company always exists and challenges us to act and apply our knowledge in order to stay competitive, as well as to strengthen the work team through motivation, achieve efficiency, and develop a good organizational climate.
Keywords: challenges, efficiency, leadership, resilience skills
Procedia PDF Downloads 77
8240 Economics of Precision Mechanization in Wine and Table Grape Production
Authors: Dean A. McCorkle, Ed W. Hellman, Rebekka M. Dudensing, Dan D. Hanselka
Abstract:
The motivation for this study centers on the labor- and cost-intensive nature of wine and table grape production in the U.S. and the potential opportunities for precision mechanization using robotics to augment those production tasks that are labor-intensive. The objectives of this study are to evaluate the economic viability of grape production in five U.S. states under current operating conditions, identify common production challenges and tasks that could be augmented with new technology, and quantify the maximum price for new technology that growers would be able to pay. Wine and table grape production is primed for precision mechanization technology as it faces a variety of production and labor issues. Methodology: Using a grower panel process, this project includes the development of a representative wine grape vineyard in five states and a representative table grape vineyard in California. The panels provided production, budget, and financial information that is typical for vineyards in their area. Labor costs for various production tasks are of particular interest. Using data from the representative budgets, 10-year projected financial statements have been developed for each representative vineyard and evaluated using a stochastic simulation model. Labor costs for selected vineyard production tasks were evaluated for the potential of new precision mechanization technology under development. These tasks were selected based on a variety of factors, including input from the panel members and the extent to which the development of new technology was deemed feasible. The net present value (NPV) of the labor cost over seven years for each production task was derived. This allowed the calculation of a maximum price for new technology at which the NPV of labor costs would equal the NPV of purchasing, owning, and operating the new technology.
Expected Results: The results from the stochastic model will show the projected financial health of each representative vineyard over the 2015-2024 timeframe. The investigators have developed a preliminary list of production tasks with potential for precision mechanization. For each task, the labor requirements, labor costs, and the maximum price for new technology will be presented and discussed. Together, these results will allow technology developers to focus and prioritize their research and development efforts for wine and table grape vineyards, and suggest opportunities to strengthen vineyard profitability and long-term viability using precision mechanization.
Keywords: net present value, robotic technology, stochastic simulation, wine and table grapes
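The maximum-price calculation described above reduces to equating two net present values; a sketch with invented cash flows (the discount rate, labor cost, and operating cost are illustrative, not the study's figures):

```python
def npv(cashflows, rate):
    """Net present value of end-of-year cashflows (years 1..n) at a discount rate."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows, start=1))

# Hypothetical task: $40,000/yr of labor over the 7-year horizon, 6% discount rate
labor_npv = npv([40_000] * 7, 0.06)

# Maximum technology price: the labor NPV the machine displaces, net of the NPV
# of an assumed $5,000/yr operating cost for the machine itself
max_price = labor_npv - npv([5_000] * 7, 0.06)
```

A grower paying `max_price` would be exactly indifferent, in NPV terms, between continuing the manual task and buying the machine.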
Procedia PDF Downloads 261
8239 Land Cover Mapping Using Sentinel-2, Landsat-8 Satellite Images, and Google Earth Engine: A Study Case of the Beterou Catchment
Authors: Ella Sèdé Maforikan
Abstract:
Accurate land cover mapping is essential for effective environmental monitoring and natural resources management. This study focuses on assessing the classification performance of two satellite datasets and evaluating the impact of different input feature combinations on classification accuracy in the Beterou catchment, situated in the northern part of Benin. Landsat-8 and Sentinel-2 images from June 1, 2020, to March 31, 2021, were utilized. Employing the Random Forest (RF) algorithm on Google Earth Engine (GEE), a supervised classification categorized the land into five classes: forest, savannas, cropland, settlement, and water bodies. GEE was chosen due to its high-performance computing capabilities, mitigating computational burdens associated with traditional land cover classification methods. By eliminating the need for individual satellite image downloads and providing access to an extensive archive of remote sensing data, GEE facilitated efficient model training on remote sensing data. The study achieved commendable overall accuracy (OA), ranging from 84% to 85%, even without incorporating spectral indices and terrain metrics into the model. Notably, the inclusion of additional input sources, specifically terrain features like slope and elevation, enhanced classification accuracy. The highest accuracy was achieved with Sentinel-2 (OA = 91%, Kappa = 0.88), slightly surpassing Landsat-8 (OA = 90%, Kappa = 0.87). This underscores the significance of combining diverse input sources for optimal accuracy in land cover mapping. The methodology presented herein not only enables the creation of precise, expeditious land cover maps but also demonstrates the prowess of cloud computing through GEE for large-scale land cover mapping with remarkable accuracy. The study emphasizes the synergy of different input sources to achieve superior accuracy. 
As a future recommendation, the application of Light Detection and Ranging (LiDAR) technology is proposed to enhance vegetation type differentiation in the Beterou catchment. Additionally, a cross-comparison between Sentinel-2 and Landsat-8 for assessing long-term land cover changes is suggested.
Keywords: land cover mapping, Google Earth Engine, random forest, Beterou catchment
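The overall accuracy and Cohen's kappa reported above are both computed from a classification confusion matrix; a minimal sketch of the two metrics (plain Python, with an illustrative two-class matrix rather than the study's data) is:

```python
def accuracy_and_kappa(cm):
    """Overall accuracy and Cohen's kappa from a square confusion matrix
    (rows = reference classes, columns = predicted classes)."""
    k = len(cm)
    n = sum(sum(row) for row in cm)
    oa = sum(cm[i][i] for i in range(k)) / n  # observed agreement
    # expected chance agreement from the row/column marginals
    pe = sum(sum(cm[i]) * sum(row[i] for row in cm) for i in range(k)) / n ** 2
    return oa, (oa - pe) / (1 - pe)           # kappa corrects OA for chance

# Toy matrix: 45 correct and 5 confused in each of two classes
oa, kappa = accuracy_and_kappa([[45, 5], [5, 45]])
```

The same two numbers (OA = 0.91, kappa = 0.88 for Sentinel-2) summarize the study's five-class matrices.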
Procedia PDF Downloads 64
8238 Health Status Monitoring of COVID-19 Patients through Blood Tests and Naïve Bayes
Authors: Carlos Arias-Alcaide, Cristina Soguero-Ruiz, Paloma Santos-Álvarez, Adrián García-Romero, Inmaculada Mora-Jiménez
Abstract:
Analysing clinical data with computers in a way that has an impact on practitioners’ workflow is a challenge nowadays. This paper provides a first approach for monitoring the health status of COVID-19 patients through the use of some biomarkers (blood tests) and the simplest Naïve Bayes classifier. Data from two Spanish hospitals were considered, showing the potential of our approach to estimate reasonable posterior probabilities even some days before the event.
Keywords: Bayesian model, blood biomarkers, classification, health tracing, machine learning, posterior probability
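The core of the approach, posterior class probabilities from a handful of biomarkers under the naïve independence assumption, can be sketched with Gaussian likelihoods. The means, deviations, and priors below are invented placeholders, not hospital data.

```python
import math

def gaussian_pdf(x, mu, sigma):
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def posterior(x, params, priors):
    """Naive Bayes posterior. params[c] is a list of (mu, sigma) per biomarker;
    likelihoods multiply because biomarkers are assumed conditionally independent."""
    scores = {}
    for c, prior in priors.items():
        p = prior
        for xi, (mu, sigma) in zip(x, params[c]):
            p *= gaussian_pdf(xi, mu, sigma)
        scores[c] = p
    z = sum(scores.values())          # normalize so the posteriors sum to 1
    return {c: s / z for c, s in scores.items()}

# Hypothetical single biomarker: class 0 centred at 0.0, class 1 at 2.0
post = posterior([0.0], {0: [(0.0, 1.0)], 1: [(2.0, 1.0)]}, {0: 0.5, 1: 0.5})
```

Tracking `post` as new blood tests arrive gives the day-by-day health-status estimate the abstract describes.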
Procedia PDF Downloads 234
8237 Improved Skin Detection Using Colour Space and Texture
Authors: Medjram Sofiane, Babahenini Mohamed Chaouki, Mohamed Benali Yamina
Abstract:
Skin detection is an important task for computer vision systems, and the quality of the detection method largely determines the success of the overall system. Colour is a good descriptor for detecting skin in images, but lighting effects and objects whose colour resembles skin make skin detection difficult. In this paper, we propose a method that uses the YCbCr colour space for skin detection and the elimination of lighting effects, and then uses texture information to eliminate the false regions detected by the YCbCr skin colour model.
Keywords: skin detection, YCbCr, GLCM, texture, human skin
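A minimal sketch of the colour-space stage: convert RGB to YCbCr and threshold the chrominance channels. The Cb/Cr bounds below are commonly cited illustrative values, not the thresholds tuned in this paper, and the texture (GLCM) stage is omitted.

```python
def rgb_to_ycbcr(r, g, b):
    """ITU-R BT.601 conversion for 8-bit RGB components."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin(r, g, b, cb_range=(77, 127), cr_range=(133, 173)):
    """Chrominance-only test; ignoring luminance Y gives some robustness
    to lighting, which is the motivation for using YCbCr here."""
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return cb_range[0] <= cb <= cb_range[1] and cr_range[0] <= cr <= cr_range[1]
```

Pixels passing this test form the candidate mask that the texture stage would then prune of false positives.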
Procedia PDF Downloads 459
8236 Modelling and Simulation of Photovoltaic Cell
Authors: Fouad Berrabeh, Sabir Messalti
Abstract:
The performance of photovoltaic systems depends strongly on operating conditions such as solar irradiation and temperature. It is therefore important to provide detailed studies for different cases so that power can be supplied continuously, which requires the photovoltaic system to be properly sized. This paper presents the modelling and simulation of a photovoltaic cell using the single diode model. I-V and P-V characteristics are presented and verified under different conditions (irradiance effect, temperature effect, series resistance effect).
Keywords: photovoltaic cell, BP SX 150 BP solar photovoltaic module, irradiance effect, temperature effect, series resistance effect, I–V characteristics, P–V characteristics
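The single diode model gives the cell current only implicitly, I = Iph − I0(exp((V + I·Rs)/(n·Vt)) − 1), with shunt resistance neglected in this sketch. One hedged way to solve it is bisection; the parameter values below are generic illustrations, not the BP SX 150 datasheet values.

```python
import math

def pv_current(v, iph=5.0, i0=1e-9, rs=0.01, n=1.3, vt=0.0258):
    """Solve I = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) for 0 <= V <= Voc.
    f(i) below is strictly decreasing in i, so bisection on [0, Iph] is safe."""
    def f(i):
        return iph - i0 * (math.exp((v + i * rs) / (n * vt)) - 1) - i
    lo, hi = 0.0, iph
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if f(mid) > 0 else (lo, mid)
    return 0.5 * (lo + hi)

# Sweeping v from 0 upward traces the I-V curve; P-V follows as v * pv_current(v)
```

Irradiance and temperature effects enter by making Iph, I0, and Vt functions of the operating conditions.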
Procedia PDF Downloads 490
8235 Second Sub-Harmonic Resonance in Vortex-Induced Vibrations of a Marine Pipeline Close to the Seabed
Authors: Yiming Jin, Yuanhao Gao
Abstract:
In this paper, using the method of multiple scales, the second sub-harmonic resonance in vortex-induced vibrations (VIV) of a marine pipeline close to the seabed is investigated based on a developed wake oscillator model. The amplitude-frequency equations are also derived. It is found that the oscillation grows continuously when both discriminants of the amplitude-frequency equations are positive, and decays when the discriminants are negative.
Keywords: vortex-induced vibrations, marine pipeline, seabed, sub-harmonic resonance
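The abstract does not reproduce the governing equations, so as a hedged illustration only, a widely used van der Pol-type wake oscillator (in the spirit of Facchinetti-type models, not necessarily the authors' developed model) couples the cross-flow displacement $y$ of the structure to a fluctuating wake variable $q$:

```latex
\ddot{y} + 2\xi\omega_s\,\dot{y} + \omega_s^2\, y = \Lambda\, q, \qquad
\ddot{q} + \varepsilon\,\omega_f\,\bigl(q^2 - 1\bigr)\,\dot{q} + \omega_f^2\, q = \frac{A}{D}\,\ddot{y},
```

where $\omega_s$ and $\omega_f$ are the structural and vortex-shedding frequencies and $\Lambda$, $\varepsilon$, $A$ are coupling and tuning constants. Seabed proximity and the second sub-harmonic resonance would enter through the specific model the authors develop, which the method of multiple scales then reduces to the amplitude-frequency equations whose discriminants govern growth or decay.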
Procedia PDF Downloads 332
8234 Pre-Operative Psychological Factors Significantly Add to the Predictability of Chronic Narcotic Use: A Two Year Prospective Study
Authors: Dana El-Mughayyar, Neil Manson, Erin Bigney, Eden Richardson, Dean Tripp, Edward Abraham
Abstract:
Use of narcotics to treat pain has increased over the past two decades and is a contributing factor to the current public health crisis. Understanding the pre-operative risks of chronic narcotic use may be aided through investigation of psychological measures. The objective of the reported study is to determine predictors of narcotic use two years post-surgery in a thoracolumbar spine surgery population, including an array of psychological factors. A prospective observational study of 191 consecutively enrolled adult patients having undergone thoracolumbar spine surgery is presented. Baseline measures of interest included the Pain Catastrophizing Scale (PCS), Tampa Scale for Kinesiophobia, Multidimensional Scale for Perceived Social Support (MSPSS), Chronic Pain Acceptance Questionnaire (CPAQ-8), Oswestry Disability Index (ODI), Numeric Rating Scales for back and leg pain (NRS-B/L), SF-12’s Mental Component Summary (MCS), narcotic use, and demographic variables. The post-operative measure of interest is narcotic use at 2-year follow-up. Narcotic use is collapsed into binary categories of use and no use. Descriptive statistics are run. Chi-square analysis is used for categorical variables and an ANOVA for continuous variables. Significant variables are built into a hierarchical logistic regression to determine predictors of post-operative narcotic use. Significance is set at α < 0.05. Results: A total of 27.23% of the sample were using narcotics two years after surgery. The regression model included ODI, NRS-Leg, time with condition, chief complaint, pre-operative drug use, gender, MCS, PCS subscale helplessness, and CPAQ subscale pain willingness and was significant, χ²(13, N = 191) = 54.99, p < .001. The model accounted for 39.6% of the variance in narcotic use and correctly classified 79.7% of cases. Psychological variables accounted for 9.6% of the variance over and above the other predictors. 
Conclusions: Managing chronic narcotic usage is central to the patient’s overall health and quality of life. Psychological factors in the preoperative period are significant predictors of narcotic use 2 years post-operatively. The psychological variables are malleable, potentially allowing surgeons to direct their patients to preventative resources prior to surgery.
Keywords: narcotics, psychological factors, quality of life, spine surgery
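The hierarchical logistic regression behind these results can be illustrated with a bare-bones gradient-descent fitter on toy data: fit a base block of predictors first, then add the psychological block and compare the variance explained. This sketch is not the authors' statistical software, and the data are invented.

```python
import math

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Stochastic gradient ascent on the log-likelihood; returns
    weights [bias, w1, ..., wk] for binary outcomes y in {0, 1}."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1 / (1 + math.exp(-z))
            w[0] += lr * (yi - p)
            for j, xj in enumerate(xi):
                w[j + 1] += lr * (yi - p) * xj
    return w

def predict(w, xi):
    """Predicted probability of narcotic use for one patient's predictors."""
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1 / (1 + math.exp(-z))

# Toy single predictor (e.g. a standardized disability score): higher -> use
X, y = [[0.0], [1.0], [2.0], [3.0]], [0, 0, 1, 1]
w = fit_logistic(X, y)
```

In the hierarchical setup, the model is fit once with the base block and once with the psychological block added; the gain in pseudo-R² (9.6% in the study) is the psychological variables' unique contribution.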
Procedia PDF Downloads 145
8233 A Comprehensive Study of Spread Models of Wildland Fires
Authors: Manavjit Singh Dhindsa, Ursula Das, Kshirasagar Naik, Marzia Zaman, Richard Purcell, Srinivas Sampalli, Abdul Mutakabbir, Chung-Horng Lung, Thambirajah Ravichandran
Abstract:
These days, wildland fires, also known as forest fires, are more prevalent than ever. Wildfires have major repercussions that affect ecosystems, communities, and the environment in several ways. Wildfires lead to habitat destruction and biodiversity loss, affecting ecosystems and causing soil erosion. They also contribute to poor air quality by releasing smoke and pollutants that pose health risks, especially for individuals with respiratory conditions. Wildfires can damage infrastructure, disrupt communities, and cause economic losses. The economic impact of firefighting efforts, combined with their direct effects on forestry and agriculture, causes significant financial difficulties for the areas impacted. This research explores different forest fire spread models and presents a comprehensive review of various techniques and methodologies used in the field. A forest fire spread model is a computational or mathematical representation that is used to simulate and predict the behavior of a forest fire. By applying scientific concepts and data from empirical studies, these models attempt to capture the intricate dynamics of how a fire spreads, taking into consideration a variety of factors like weather patterns, topography, fuel types, and environmental conditions. These models assist authorities in understanding and forecasting the potential trajectory and intensity of a wildfire. Emphasizing the need for a comprehensive understanding of wildfire dynamics, this research explores the approaches, assumptions, and findings derived from various models. By using a comparison approach, a critical analysis is provided by identifying patterns, strengths, and weaknesses among these models. The purpose of the survey is to further wildfire research and management techniques. Decision-makers, researchers, and practitioners can benefit from the useful insights that are provided by synthesizing established information. 
Fire spread models provide insights into potential fire behavior, enabling authorities to make informed decisions about evacuation activities, allocating resources for fire-fighting efforts, and planning preventive actions. Wildfire spread models are also useful in post-wildfire mitigation strategies, as they help in assessing the fire's severity, determining high-risk regions for post-fire dangers, and forecasting soil erosion trends. The analysis highlights the importance of customized modeling approaches for various circumstances and advances our understanding of the way forest fires spread. Some of the known models in this field are Rothermel’s wildland fuel model, FARSITE, WRF-SFIRE, FIRETEC, FlamMap, FSPro, the cellular automata model, and others. The key characteristics that these models consider include weather (factors such as wind speed and direction), topography (factors like landscape elevation), and fuel availability (factors like types of vegetation), among others. The models discussed are physics-based, data-driven, or hybrid models, some utilizing ML techniques like attention-based neural networks to enhance model performance. In order to lessen the destructive effects of forest fires, this initiative aims to promote the development of more precise prediction tools and effective management techniques. The survey expands its scope to address the practical needs of numerous stakeholders. Access to enhanced early warning systems enables decision-makers to take prompt action. Emergency responders benefit from improved resource allocation strategies, strengthening the efficacy of firefighting efforts.
Keywords: artificial intelligence, deep learning, forest fire management, fire risk assessment, fire simulation, machine learning, remote sensing, wildfire modeling
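Among the families surveyed, the cellular automata approach is the easiest to sketch: the landscape is a grid of cells whose states update from their neighbours. The single spread probability below is an illustrative constant; real models modulate it with wind, slope, and fuel type.

```python
import random

# States: 0 = unburnt fuel, 1 = burning, 2 = burnt out
def step(grid, p_spread=0.6, rng=None):
    """One synchronous update of a probabilistic fire-spread automaton."""
    rng = rng or random.Random(0)
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1:
                new[r][c] = 2  # a burning cell burns out after one step
                for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols and grid[rr][cc] == 0:
                        if rng.random() < p_spread:
                            new[rr][cc] = 1  # fire jumps to a fuel neighbour
    return new
```

Iterating `step` from a single ignition cell produces a spreading burn scar; weather, topography, and fuel heterogeneity would enter by making `p_spread` a per-cell, per-direction function.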
Procedia PDF Downloads 82
8232 Influence of Titanium Oxide on Crystallization, Microstructure and Mechanical Behavior of Barium Fluormica Glass-Ceramics
Authors: Amit Mallik, Anil K. Barik, Biswajit Pal
Abstract:
The rapid advancement of research on glass-ceramics stems from their wide applications in the electronics industry and, to some extent, in application-oriented medical dentistry. TiO2, even in low concentrations, has been found to strongly influence the physical and mechanical properties of glasses. Glass-ceramics are polycrystalline ceramic materials produced through the controlled crystallization of glasses. Crystallization is accomplished by subjecting suitable parent glasses to a regulated heat treatment involving the nucleation and growth of crystal phases in the glass. Mica glass-ceramics is a new kind of glass-ceramics based on the system SiO2•MgO•K2O•F. The predominant crystalline phase is synthetic fluormica, named fluorophlogopite. Mica-containing glass-ceramics offer exceptional machinability in addition to their unique thermal and chemical properties. Machinability arises from the randomly oriented mica crystals with a 'house of cards' microstructure, which allows cracks to propagate readily along the mica plane but hinders crack propagation across the layers. In the present study, we have systematically investigated the crystallization, microstructure and mechanical behavior of barium fluorophlogopite mica-containing glass-ceramics of composition BaO•4MgO•Al2O3•6SiO2•2MgF2 nucleated by the addition of 2, 4, 6 and 8 wt% TiO2. The glass samples were prepared by the melting technique. After annealing, the glass batches for nucleation were fired at 730°C (2 wt% TiO2), 720°C (4 wt% TiO2), 710°C (6 wt% TiO2) and 700°C (8 wt% TiO2), respectively, for 2 h and ultimately heated to the corresponding crystallization temperatures. The glass batches were characterized by differential thermal analysis (DTA), x-ray diffraction (XRD), scanning electron microscopy (SEM) and microhardness indentation. The DTA study found that the fluorophlogopite mica crystallization exotherm appeared in the temperature range 886–903°C. 
The glass transition temperature (Tg) and crystallization peak temperature (Tp) increased with increasing TiO2 content up to 4 wt%; beyond this level, both Tg and Tp decreased as the TiO2 content increased to 8 wt%. Scanning electron microscopy confirms the development of an interconnected ‘house of cards’ microstructure promoted by TiO2 as a nucleating agent. Increasing the TiO2 content decreases the Vickers hardness values of the glass-ceramics.
Keywords: crystallization, fluormica glass, ‘house of cards’ microstructure, hardness
Procedia PDF Downloads 240
8231 Understanding the Effect of Material and Deformation Conditions on the “Wear Mode Diagram”: A Numerical Study
Authors: A. Mostaani, M. P. Pereira, B. F. Rolfe
Abstract:
The increasing application of Advanced High Strength Steel (AHSS) in the automotive industry to fulfill crash requirements has introduced higher levels of wear in stamping dies and parts. Therefore, understanding wear behaviour in sheet metal forming is of great importance as it can help to reduce the high costs currently associated with tool wear. At the contact between the die and the sheet, the tips of hard tool asperities interact with the softer sheet material. Understanding the deformation that occurs during this interaction is important for our overall understanding of the wear mechanisms. For these reasons, the scratching of a perfectly plastic material by a rigid indenter has been widely examined in the literature, with finite element modelling (FEM) used in recent years to further understand the behaviour. The ‘wear mode diagram’ has been commonly used to classify the deformation regime of the soft work-piece during scratching into three modes: ploughing, wedge formation, and cutting. This diagram, which is based on 2D slip-line theory and the upper bound method for a perfectly plastic work-piece and a rigid indenter, relates the different wear modes to attack angle and interfacial strength. This diagram has been the basis for many wear studies and wear models to date. Additionally, it has been concluded that galling is most likely to occur during the wedge formation mode. However, there has been little analysis in the literature of how the material behaviour and deformation conditions associated with metal forming processes influence the wear behaviour. Therefore, the first aim of this work is to use a commercial FEM package (Abaqus/Explicit) to build a 3D model to capture wear modes during scratching with indenters with different attack angles and different interfacial strengths. 
The second goal is to utilise the developed model to understand how wear modes might change in the presence of bulk deformation of the work-piece material as a result of the metal forming operation. Finally, the effect of the work-piece material properties, including strain hardening, will be examined to understand how these influence the wear modes and wear behaviour. The results show that both strain hardening and substrate deformation can change the critical attack angle at which the wedge formation regime is activated.
Keywords: finite element, pile-up, scratch test, wear mode
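The wear mode diagram discussed above maps two inputs (attack angle and interfacial strength) to one of three regimes. Purely as an illustration of that mapping, with placeholder boundaries rather than the slip-line-theory curves, a classifier might look like:

```python
def wear_mode(attack_angle_deg, f):
    """Classify the scratching regime of a soft work-piece.
    f is the normalised interfacial shear strength (0 <= f <= 1).
    The boundary numbers are illustrative placeholders only; the real
    boundaries follow from 2D slip-line theory and depend nonlinearly on f.
    The trend kept here is that higher interfacial strength activates
    wedge formation and cutting at lower attack angles."""
    if attack_angle_deg < 15 + 10 * (1 - f):
        return "ploughing"        # shallow asperities push material aside
    if attack_angle_deg < 45 + 30 * (1 - f):
        return "wedge formation"  # the regime most associated with galling
    return "cutting"              # steep asperities remove a chip
```

The study's contribution is precisely that these boundaries shift once strain hardening and bulk substrate deformation are included, which a fixed-threshold diagram cannot capture.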
Procedia PDF Downloads 329
8230 Storm-Runoff Simulation Approaches for External Natural Catchments of Urban Sewer Systems
Authors: Joachim F. Sartor
Abstract:
According to German guidelines, external natural catchments are larger sub-catchments without significant portions of impervious areas, which possess a surface drainage system and empty into a sewer network. Basically, such catchments should be disconnected from sewer networks, particularly from combined systems. If this is not possible due to local conditions, their flow hydrographs have to be considered in the design of sewer systems, because the impact may be significant. Since there is a lack of sufficient measurements of storm-runoff events for such catchments, and hence of verified simulation methods to analyze their design flows, German standards give only general advice and demand special consideration in such cases. Compared to urban sub-catchments, external natural catchments exhibit greatly different flow characteristics. With increasing area size their hydrological behavior approximates that of rural catchments, e.g. sub-surface flow may prevail and lag times are comparably long. Only a few observed peak flow values and simple (mostly empirical) approaches are offered in the literature for Central Europe. Most of them are at least helpful for cross-checking results achieved by simulations that lack calibration. Using storm-runoff data from five monitored rural watersheds in the west of Germany with catchment areas between 0.33 and 1.07 km², the author used multiple-event simulation to investigate three different approaches for determining the rainfall excess. These are the modified SCS variable run-off coefficient methods by Lutz and Zaiß as well as the soil moisture model by Ostrowski. Selection criteria for storm events from continuous precipitation data were taken from recommendations of M 165, and the runoff concentration method (parallel cascades of linear reservoirs) from a DWA working report to which the author had contributed. In general, the two run-off coefficient methods showed results that are of sufficient accuracy for most practical purposes. 
The soil moisture model showed no significantly better results, at least not to such a degree that it would justify the additional data collection that its parameter determination requires. In particular, typical convective summer events after long dry periods, which are often decisive for sewer networks (less so for rivers), showed discrepancies between simulated and measured flow hydrographs.
Keywords: external natural catchments, sewer network design, storm-runoff modelling, urban drainage
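For orientation, the run-off coefficient methods compared here descend from the SCS curve-number idea: rainfall excess is the portion of storm depth left after initial abstraction and retention. The classic SCS-CN formula (shown as the common reference point, not the modified Lutz or Zaiß schemes themselves) can be sketched as:

```python
def scs_rainfall_excess(p_mm, cn, ia_ratio=0.2):
    """Direct runoff depth Q (mm) from storm depth P (mm) via the SCS
    curve-number method. cn is the curve number (0 < cn <= 100);
    ia_ratio is the conventional initial-abstraction fraction of S."""
    s = 25400.0 / cn - 254.0      # potential maximum retention (mm)
    ia = ia_ratio * s             # initial abstraction (mm)
    if p_mm <= ia:
        return 0.0                # all rainfall lost before runoff starts
    return (p_mm - ia) ** 2 / (p_mm - ia + s)
```

The modified variable run-off coefficient methods replace the fixed curve number with coefficients that vary with antecedent conditions, which is what the multiple-event simulations in the study evaluate.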
Procedia PDF Downloads 153
8229 Developing a Decision-Making Tool for Prioritizing Green Building Initiatives
Authors: Tayyab Ahmad, Gerard Healey
Abstract:
Sustainability in the built environment sector is subject to many development constraints. Building projects are developed under differing deliverable requirements, which makes each project unique. For an owner organization with a significant building stock, such as a higher-education institution, it is important to prioritize some sustainability initiatives over others in order to align sustainable building development with organizational goals. Point-based green building rating tools, such as Green Star, LEED, and BREEAM, are becoming increasingly popular and are well-acknowledged worldwide for verifying sustainable development. It is imperative to synthesize a multi-criteria decision-making tool that can capitalize on the point-based methodology of rating systems while customizing the sustainable development of building projects according to the individual requirements and constraints of the client organization. A multi-criteria decision-making tool for the University of Melbourne is developed that builds on the action-learning and experience of implementing Green Buildings at the University of Melbourne. The tool evaluates the different sustainable building initiatives based on the framework of the Green Star rating tool of the Green Building Council of Australia. For each sustainability initiative, the decision-making tool makes an assessment based on at least five performance criteria, including the ease with which the initiative can be achieved and its potential to enhance project objectives, reduce life-cycle costs, enhance the University’s reputation, and increase confidence in quality construction. The use of a weighted aggregation mathematical model in the proposed tool can have a considerable role in the decision-making process of a Green Building project by indexing the Green Building initiatives in terms of organizational priorities. 
The index value of each initiative will be based on its alignment with some of the key performance criteria. The usefulness of the decision-making tool is validated by conducting structured interviews with some of the key stakeholders involved in the development of sustainable building projects at the University of Melbourne. The proposed tool helps a client organization decide which sustainability initiatives and practices should be pursued first within limited resources.
Keywords: higher education institution, multi-criteria decision-making tool, organizational values, prioritizing sustainability initiatives, weighted aggregation model
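The weighted aggregation model can be sketched as a simple weighted-sum index over the performance criteria; the criterion names, ratings, and weights below are invented for illustration.

```python
def initiative_index(scores, weights):
    """Weighted-sum index of a green building initiative.
    scores: criterion -> rating (e.g. on a 0-5 scale);
    weights: criterion -> priority weight reflecting organisational values.
    A higher index means a higher-priority initiative."""
    total = sum(weights.values())
    return sum(scores[c] * weights[c] for c in weights) / total

# Hypothetical initiative rated on two of the performance criteria;
# shifting the weights reorders priorities without re-rating anything
scores = {"ease": 4, "lifecycle_cost": 2}
cost_focused = initiative_index(scores, {"ease": 1, "lifecycle_cost": 3})
ease_focused = initiative_index(scores, {"ease": 3, "lifecycle_cost": 1})
```

This is the mechanism by which the tool indexes Green Star-style initiatives against organizational priorities: the ratings stay fixed while the weight vector encodes the organization's values.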
Procedia PDF Downloads 234
8228 Knowledge Sharing and Organizational Performance: A System Dynamics Approach
Authors: Shachi Pathak
Abstract:
We are living in a knowledge-based economy in which firms can gain competitive advantage by managing knowledge within the organization. The purpose of the study is to develop a conceptual model, using a system dynamics approach, to explain the relationship between the factors affecting knowledge sharing in an organization (known as knowledge enablers), knowledge-sharing activities, and organizational performance. This research is important because it will provide a better understanding of the key knowledge enablers that support knowledge-sharing activities, and of how those activities affect an organization's capability to enhance its performance.
Keywords: knowledge management, knowledge sharing, organizational performance, system dynamics
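A system dynamics model of this kind reduces to stocks (accumulated knowledge, organizational performance) and flows (sharing activity) integrated over time. The structure and coefficients below are a minimal invented sketch of that causal chain, not the paper's conceptual model.

```python
def simulate(enabler=0.8, steps=100, dt=0.1, decay=0.05):
    """Euler integration of a two-stock sketch: enablers drive the
    knowledge-sharing flow, which builds both the knowledge stock and
    the organizational-performance stock; both stocks decay slowly."""
    knowledge, performance = 1.0, 0.0
    for _ in range(steps):
        sharing = enabler * knowledge                      # flow driven by enablers
        knowledge += dt * (0.1 * sharing - decay * knowledge)
        performance += dt * (0.5 * sharing - decay * performance)
    return performance
```

Running the simulation at different enabler strengths is the system-dynamics analogue of asking which knowledge enablers matter most: stronger enablers feed a reinforcing loop between sharing and the knowledge stock, lifting performance.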
Procedia PDF Downloads 376