Search results for: measurement models
1168 Aerial Photogrammetry-Based Techniques to Rebuild the 30-Years Landform Changes of a Landslide-Dominated Watershed in Taiwan
Authors: Yichin Chen
Abstract:
Taiwan is an island characterized by active tectonics and high erosion rates. Monitoring Taiwan's dynamic landscape is therefore an important issue for disaster mitigation, geomorphological research, and watershed management. Long-term landform data of high spatiotemporal resolution are essential for quantifying and simulating geomorphological processes and for developing warning systems. Recently, advances in unmanned aerial vehicle (UAV) and computational photogrammetry technology have provided an effective way to reconstruct and monitor topographic change at high spatio-temporal resolution. This study reconstructs 30 years of landform change (1986-2017) in the Aiyuzi watershed using aerial photogrammetry-based techniques. The Aiyuzi watershed, located in central Taiwan with an area of 3.99 km², is known for its frequent landslide and debris flow disasters. Aerial photos were taken by UAV, and multi-temporal historical stereo photographs taken by the Aerial Survey Office of Taiwan’s Forestry Bureau were collected. Pix4DMapper, a photogrammetry software package, was used to rebuild the orthoimages and digital surface models (DSMs). Furthermore, to control model accuracy, a set of ground control points was surveyed using eGPS. The results show that the generated DSMs have a ground sampling distance (GSD) of ~10 cm and ~0.3 cm for the UAV and historical photographs, respectively, and a vertical error of ~1 m. Comparison of the DSMs shows that many deep-seated landslides (with depths over 20 m) occurred in the upstream part of the Aiyuzi watershed. Even though a large amount of sediment is delivered from the landslides, the steep main channel has sufficient capacity to transport sediment out of the channel and to erode the river bed to ~20 m in depth. Most sediment is transported to the outlet of the watershed and deposited in the downstream channel. This case study shows that UAV and photogrammetry technology are effective tools for monitoring topographic change. Keywords: aerial photogrammetry, landslide, landform change, Taiwan
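The central measurement step described above, differencing two co-registered DSMs to map erosion and deposition, can be sketched in a few lines of Python. This is a minimal illustration only, assuming the DSMs are already aligned on a common grid and loaded as NumPy arrays; the 1 m change threshold mirrors the reported vertical error, and the function names and synthetic inputs are not part of the authors' workflow.

```python
import numpy as np

def dsm_of_difference(dsm_early, dsm_late, cell_size, threshold=1.0):
    """Compute a DSM of Difference (DoD) and simple erosion/deposition volumes.

    dsm_early, dsm_late : 2-D arrays of elevations (m) on the same grid
    cell_size           : grid resolution (m); cell area is cell_size**2
    threshold           : minimum |change| (m) treated as real, i.e. above the
                          vertical error of the photogrammetric DSMs (~1 m here)
    """
    dod = dsm_late - dsm_early                       # elevation change per cell (m)
    significant = np.abs(dod) >= threshold           # mask out changes within noise
    cell_area = cell_size ** 2
    erosion_volume = -dod[significant & (dod < 0)].sum() * cell_area    # m^3 removed
    deposition_volume = dod[significant & (dod > 0)].sum() * cell_area  # m^3 added
    return dod, erosion_volume, deposition_volume

# Illustrative use with synthetic 10 cm resolution DSM patches
rng = np.random.default_rng(0)
dsm_1986 = rng.normal(1200.0, 5.0, size=(500, 500))
dsm_2017 = dsm_1986 - rng.uniform(0.0, 20.0, size=(500, 500))  # deep-seated loss
dod, eroded, deposited = dsm_of_difference(dsm_1986, dsm_2017, cell_size=0.1)
print(f"eroded: {eroded:.0f} m^3, deposited: {deposited:.0f} m^3")
```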
Procedia PDF Downloads 157
1167 Lateral Torsional Buckling: Tests on Glued Laminated Timber Beams
Authors: Vera Wilden, Benno Hoffmeister, Markus Feldmann
Abstract:
Glued laminated timber (glulam) is a preferred choice for long span girders, e.g., for gyms or storage halls. While the material provides sufficient strength to resist the bending moments, large spans lead to increased slenderness of such members and to a higher susceptibility to stability issues, in particular to lateral torsional buckling (LTB). Rules for the determination of the ultimate LTB resistance are provided by Eurocode 5. The verifications of the resistance may be performed using the so called equivalent member method or by means of theory 2nd order calculations (direct method), considering equivalent imperfections. Both methods have significant limitations concerning their applicability; the equivalent member method is limited to rather simple cases; the direct method is missing detailed provisions regarding imperfections and requirements for numerical modeling. In this paper, the results of a test series on slender glulam beams in three- and four-point bending are presented. The tests were performed in an innovative, newly developed testing rig, allowing for a very precise definition of loading and boundary conditions. The load was introduced by a hydraulic jack, which follows the lateral deformation of the beam by means of a servo-controller, coupled with the tested member and keeping the load direction vertically. The deformation-controlled tests allowed for the identification of the ultimate limit state (governed by elastic stability) and the corresponding deformations. Prior to the tests, the structural and geometrical imperfections were determined and used later in the numerical models. After the stability tests, the nearly undamaged members were tested again in pure bending until reaching the ultimate moment resistance of the cross-section. These results, accompanied by numerical studies, were compared to resistance values obtained using both methods according to Eurocode 5.Keywords: experimental tests, glued laminated timber, lateral torsional buckling, numerical simulation
Procedia PDF Downloads 238
1166 Comparison of Multivariate Adaptive Regression Splines and Random Forest Regression in Predicting Forced Expiratory Volume in One Second
Authors: P. V. Pramila, V. Mahesh
Abstract:
Pulmonary function tests are important non-invasive diagnostic tests for assessing respiratory impairments and provide quantifiable measures of lung function. Spirometry is the most frequently used measure of lung function and plays an essential role in the diagnosis and management of pulmonary diseases. However, the test requires considerable patient effort and cooperation, markedly related to the age of patients, resulting in incomplete data sets. This paper presents a nonlinear model built using multivariate adaptive regression splines (MARS) and a random forest regression model to predict the missing spirometric features. Random forest based feature selection is used to enhance both the generalization capability and the interpretability of the model. In the present study, flow-volume data were recorded for N = 198 subjects. The ranked order of feature importance calculated by the random forest model shows that the spirometric features FVC, FEF25, PEF, FEF25-75, FEF50, and the demographic parameter height are the important descriptors. A comparison of the performance of both models shows that the prediction ability of MARS with the top two ranked features, namely FVC and FEF25, is higher, yielding a model fit of R2 = 0.96 and R2 = 0.99 for normal and abnormal subjects, respectively. Root mean square error analysis of the RF model and the MARS model also shows that the latter is capable of predicting the missing values of FEV1 with a notably lower error of 0.0191 (normal subjects) and 0.0106 (abnormal subjects). It is concluded that combining feature selection with a prediction model provides a minimum subset of predominant features to train the model, yielding better prediction performance. This analysis can assist clinicians as an intelligent decision support system in medical diagnosis and in the improvement of clinical care. Keywords: FEV, multivariate adaptive regression splines, pulmonary function test, random forest
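The two-stage procedure described, random-forest feature ranking followed by a regression model trained on the top-ranked features, can be sketched with scikit-learn as below. This is a hedged illustration, not the authors' code: the spirometry table is fabricated, and a random forest stands in for MARS in the second stage because scikit-learn has no built-in MARS implementation (a dedicated package would be needed for that step).

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

# Hypothetical spirometry table; column names follow the abstract, values are placeholders.
features = ["FVC", "FEF25", "PEF", "FEF25_75", "FEF50", "height"]
rng = np.random.default_rng(1)
df = pd.DataFrame(rng.normal(size=(198, len(features))), columns=features)
df["FEV1"] = 0.8 * df["FVC"] + 0.1 * df["FEF25"] + rng.normal(0, 0.1, 198)

X_train, X_test, y_train, y_test = train_test_split(df[features], df["FEV1"], random_state=0)

# Step 1: random-forest-based feature ranking
rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_train, y_train)
ranking = sorted(zip(features, rf.feature_importances_), key=lambda t: -t[1])
top_two = [name for name, _ in ranking[:2]]
print("feature ranking:", ranking)

# Step 2: refit a predictor on the top two features only (the paper uses MARS here)
model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_train[top_two], y_train)
pred = model.predict(X_test[top_two])
print("R2:", r2_score(y_test, pred), "RMSE:", mean_squared_error(y_test, pred) ** 0.5)
```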
Procedia PDF Downloads 310
1165 Assessing Carbon Stock and Sequestration of Reforestation Species on Old Mining Sites in Morocco Using the DNDC Model
Authors: Nabil Elkhatri, Mohamed Louay Metougui, Ngonidzashe Chirinda
Abstract:
Mining activities have left a legacy of degraded landscapes, prompting urgent efforts for ecological restoration. Reforestation holds promise as a potent tool to rehabilitate these old mining sites, with the potential to sequester carbon and contribute to climate change mitigation. This study focuses on evaluating the carbon stock and sequestration potential of reforestation species in the context of Morocco's mining areas, employing the DeNitrification-DeComposition (DNDC) model. The research is grounded in recognizing the need to connect theoretical models with practical implementation, ensuring that reforestation efforts are informed by accurate and context-specific data. Field data collection encompasses growth patterns, biomass accumulation, and carbon sequestration rates, establishing an empirical foundation for the study's analyses. By integrating the collected data with the DNDC model, the study aims to provide a comprehensive understanding of carbon dynamics within reforested ecosystems on old mining sites. The major findings reveal varying sequestration rates among different reforestation species, indicating the potential for species-specific optimization of reforestation strategies to enhance carbon capture. This research's significance lies in its potential to contribute to sustainable land management practices and climate change mitigation strategies. By quantifying the carbon stock and sequestration potential of reforestation species, the study serves as a valuable resource for policymakers, land managers, and practitioners involved in ecological restoration and carbon management. Ultimately, the study aligns with global objectives to rejuvenate degraded landscapes while addressing pressing climate challenges.Keywords: carbon stock, carbon sequestration, DNDC model, ecological restoration, mining sites, Morocco, reforestation, sustainable land management.
Procedia PDF Downloads 76
1164 Artificial Intelligence Impact on Strategic Stability
Authors: Darius Jakimavicius
Abstract:
Artificial intelligence is the subject of intense debate in the international arena, identified both as a technological breakthrough and as a factor affecting strategic stability. Both the kinetic and non-kinetic development of AI and its application in the national strategies of the great powers may trigger a change in the security situation. Artificial intelligence is generally faster, more capable, and more efficient than humans, and there is a temptation to transfer decision-making and control responsibilities to it. Artificial intelligence which, once activated, can select and act on targets without further intervention by a human operator blurs the boundary between human and robot (machine) warfare, or perhaps warfare conducted by humans and robots together. Artificial intelligence acts as a force multiplier that speeds up decision-making and reaction times on the battlefield. The role of humans is increasingly moving away from direct decision-making and from command and control processes involving the use of force. It is worth noting that the autonomy and precision of AI systems make the process of strategic stability more complex. Deterrence theory is currently in a phase of development in which deterrence is undergoing further strain and crisis due to the complexity of the evolving models enabled by artificial intelligence. Based on the concept of strategic stability and deterrence theory, it is appropriate to develop further research on the development and impact of AI in order to assess AI from both a scientific and a technical perspective: to capture a new niche in the scientific literature and academic terminology, to clarify the conditions for deterrence, and to identify the potential uses, impacts, and possibly the scale of AI deployment. The research problem is the impact of artificial intelligence developed by the great powers on strategic stability. This thesis seeks to assess the impact of AI on strategic stability and deterrence principles, with human exclusion from the decision-making and control loop as a key axis. The interaction between AI and human actions and interests can determine fundamental changes in the great powers' defense and deterrence, and the development and application of AI-based great power strategies can lead to a change in strategic stability. Keywords: artificial intelligence, strategic stability, deterrence theory, decision-making loop
Procedia PDF Downloads 42
1163 Identifying a Drug Addict Person Using Artificial Neural Networks
Authors: Mustafa Al Sukar, Azzam Sleit, Abdullatif Abu-Dalhoum, Bassam Al-Kasasbeh
Abstract:
Use and abuse of drugs by teens is very common and can have dangerous consequences. Drugs contribute to physical and sexual aggression such as assault or rape. Some teenagers regularly use drugs to compensate for depression, anxiety, or a lack of positive social skills. Teenage smoking should not be minimized because it can be a "gateway" to other drugs (marijuana, cocaine, hallucinogens, inhalants, and heroin). The combination of teenagers' curiosity, risk-taking behavior, and social pressure makes it very difficult to say no. This leads most teenagers to the question: "Will it hurt to try once?" Nowadays, technological advances are changing our lives very rapidly and adding many technologies that help us track the risk of drug abuse, such as smart phones, Wireless Sensor Networks (WSNs), the Internet of Things (IoT), etc. These technologies may help with the early discovery of drug abuse in order to prevent the influence of drugs on the abuser from worsening. In this paper, we have developed a Decision Support System (DSS) for detecting drug abuse using an Artificial Neural Network (ANN); we used a Multilayer Perceptron (MLP) feed-forward neural network in developing the system. The input layer includes 50 variables, while the output layer contains one neuron which indicates whether the person is a drug addict. An iterative process is used to determine the number of hidden layers and the number of neurons in each one. We used multiple experimental models trained with a log-sigmoid transfer function. In particular, 10-fold cross-validation schemes are used to assess the generalization of the proposed system. The experimental results show a classification accuracy of 98.42% for correct diagnosis in our system. The data were taken from 184 cases in Jordan, based on a set of questions compiled by specialists, and were obtained through the families of drug abusers. Keywords: drug addiction, artificial neural networks, multilayer perceptron (MLP), decision support system
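A minimal sketch of the classifier described, a feed-forward MLP with a log-sigmoid (logistic) activation assessed by 10-fold cross-validation, is given below using scikit-learn. The questionnaire data are confidential, so random placeholder features are used, and the single hidden layer of 20 neurons is an arbitrary starting point for the iterative architecture search mentioned in the abstract.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical stand-in for the 184-case questionnaire data (50 input variables,
# binary "drug addict" label); the real survey data are not public.
rng = np.random.default_rng(0)
X = rng.normal(size=(184, 50))
y = (X[:, :5].sum(axis=1) + rng.normal(0, 0.5, 184) > 0).astype(int)

# Feed-forward MLP with a logistic (log-sigmoid) activation, as in the abstract
clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(20,), activation="logistic",
                  max_iter=2000, random_state=0),
)

# 10-fold cross-validation to assess generalization
scores = cross_val_score(clf, X, y, cv=10, scoring="accuracy")
print(f"mean accuracy over 10 folds: {scores.mean():.4f}")
```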
Procedia PDF Downloads 299
1162 Syntax and Words as Evolutionary Characters in Comparative Linguistics
Authors: Nancy Retzlaff, Sarah J. Berkemer, Trudie Strauss
Abstract:
In the last couple of decades, the advent of digitalization of any kind of data was probably one of the major advances in all fields of study. This paves the way for also analysing these data even though they might come from disciplines where there was no initial computational necessity to do so. Especially in linguistics, one can find a rather manual tradition. Still when considering studies that involve the history of language families it is hard to overlook the striking similarities to bioinformatics (phylogenetic) approaches. Alignments of words are such a fairly well studied example of an application of bioinformatics methods to historical linguistics. In this paper we will not only consider alignments of strings, i.e., words in this case, but also alignments of syntax trees of selected Indo-European languages. Based on initial, crude alignments, a sophisticated scoring model is trained on both letters and syntactic features. The aim is to gain a better understanding on which features in two languages are related, i.e., most likely to have the same root. Initially, all words in two languages are pre-aligned with a basic scoring model that primarily selects consonants and adjusts them before fitting in the vowels. Mixture models are subsequently used to filter ‘good’ alignments depending on the alignment length and the number of inserted gaps. Using these selected word alignments it is possible to perform tree alignments of the given syntax trees and consequently find sentences that correspond rather well to each other across languages. The syntax alignments are then filtered for meaningful scores—’good’ scores contain evolutionary information and are therefore used to train the sophisticated scoring model. Further iterations of alignments and training steps are performed until the scoring model saturates, i.e., barely changes anymore. A better evaluation of the trained scoring model and its function in containing evolutionary meaningful information will be given. An assessment of sentence alignment compared to possible phrase structure will also be provided. The method described here may have its flaws because of limited prior information. This, however, may offer a good starting point to study languages where only little prior knowledge is available and a detailed, unbiased study is needed.Keywords: alignments, bioinformatics, comparative linguistics, historical linguistics, statistical methods
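The pre-alignment step described, aligning two words under a scoring model that favours consonant correspondences, is essentially a global sequence alignment. The sketch below shows a plain Needleman-Wunsch alignment with a crude initial scoring function; the actual scoring model in the study is trained iteratively on letters and syntactic features, and the gap penalty and example words here are illustrative assumptions.

```python
def align(word_a, word_b, score, gap=-1.0):
    """Global (Needleman-Wunsch) alignment of two words.

    `score(x, y)` is the letter-pair score; any callable works, e.g. one
    estimated iteratively from previously accepted alignments.
    Returns the optimal alignment score and one optimal alignment.
    """
    n, m = len(word_a), len(word_b)
    dp = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = i * gap
    for j in range(1, m + 1):
        dp[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            dp[i][j] = max(dp[i - 1][j - 1] + score(word_a[i - 1], word_b[j - 1]),
                           dp[i - 1][j] + gap,   # gap in word_b
                           dp[i][j - 1] + gap)   # gap in word_a
    # Traceback to recover one optimal alignment
    i, j, top, bottom = n, m, [], []
    while i > 0 or j > 0:
        if i > 0 and j > 0 and dp[i][j] == dp[i - 1][j - 1] + score(word_a[i - 1], word_b[j - 1]):
            top.append(word_a[i - 1]); bottom.append(word_b[j - 1]); i, j = i - 1, j - 1
        elif i > 0 and dp[i][j] == dp[i - 1][j] + gap:
            top.append(word_a[i - 1]); bottom.append("-"); i -= 1
        else:
            top.append("-"); bottom.append(word_b[j - 1]); j -= 1
    return dp[n][m], ("".join(reversed(top)), "".join(reversed(bottom)))

# Crude initial scoring model: reward identical letters, favour consonant/consonant matches
def crude_score(x, y):
    if x == y:
        return 2.0
    vowels = set("aeiou")
    return 0.5 if (x in vowels) == (y in vowels) else -1.0

print(align("nacht", "night", crude_score))  # German/English cognate pair
```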
Procedia PDF Downloads 154
1161 The Development of an Accident Causation Model Specific to Agriculture: The Irish Farm Accident Causation Model
Authors: Carolyn Scott, Rachel Nugent
Abstract:
The agricultural industry in Ireland and worldwide is one of the most dangerous occupations with respect to occupational health and safety accidents and fatalities. Many accident causation models have been developed in safety research to understand the underlying and contributory factors that lead to the occurrence of an accident. Due to the uniqueness of the agricultural sector, current accident causation theories cannot be applied. This paper presents an accident causation model named the Irish Farm Accident Causation Model (IFACM) which has been specifically tailored to the needs of Irish farms. The IFACM is a theoretical and practical model of accident causation that arranges the causal factors into a graphic representation of originating, shaping, and contributory factors that lead to accidents when unsafe acts and conditions are created that are not rectified by control measures. Causes of farm accidents were assimilated by means of a thorough literature review and were collated to form a graphical representation of the underlying causes of a farm accident. The IFACM was validated retrospectively through case study analysis and peer review. Participants in the case study (n=10) identified causes that led to a farm accident in which they were involved. A root cause analysis was conducted to understand the contributory factors surrounding the farm accident, traced back to the ‘root cause’. Experts relevant to farm safety accident causation in the agricultural industry have peer reviewed the IFACM. The accident causation process is complex. Accident prevention requires a comprehensive understanding of this complex process because to prevent the occurrence of accidents, the causes of accidents must be known. There is little research on the key causes and contributory factors of unsafe behaviours and accidents on Irish farms. The focus of this research is to gain a deep understanding of the causality of accidents on Irish farms. The results suggest that the IFACM framework is helpful for the analysis of the causes of accidents within the agricultural industry in Ireland. The research also suggests that there may be international applicability if further research is carried out. Furthermore, significant learning can be obtained from considering the underlying causes of accidents.Keywords: farm safety, farm accidents, accident causation, root cause analysis
Procedia PDF Downloads 78
1160 Model of Application of Blockchain Technology in Public Finances
Authors: M. Vlahovic
Abstract:
This paper presents a model of public finances, which combines three concepts: participatory budgeting, crowdfunding and blockchain technology. Participatory budgeting is defined as a process in which community members decide how to spend a part of community’s budget. Crowdfunding is a practice of funding a project by collecting small monetary contributions from a large number of people via an Internet platform. Blockchain technology is a distributed ledger that enables efficient and reliable transactions that are secure and transparent. In this hypothetical model, the government or authorities on local/regional level would set up a platform where they would propose public projects to citizens. Citizens would browse through projects and support or vote for those which they consider justified and necessary. In return, they would be entitled to a tax relief in the amount of their monetary contribution. Since the blockchain technology enables tracking of transactions, it can be used to mitigate corruption, money laundering and lack of transparency in public finances. Models of its application have already been created for e-voting, health records or land registries. By presenting a model of application of blockchain technology in public finances, this paper takes into consideration the potential of blockchain technology to disrupt governments and make processes more democratic, secure, transparent and efficient. The framework for this paper consists of multiple streams of research, including key concepts of direct democracy, public finance (especially the voluntary theory of public finance), information and communication technology, especially blockchain technology and crowdfunding. The framework defines rules of the game, basic conditions for the implementation of the model, benefits, potential problems and development perspectives. As an oversimplified map of a new form of public finances, the proposed model identifies primary factors, that influence the possibility of implementation of the model, and that could be tracked, measured and controlled in case of experimentation with the model.Keywords: blockchain technology, distributed ledger, participatory budgeting, crowdfunding, direct democracy, internet platform, e-government, public finance
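To make the ledger idea concrete, the sketch below shows a minimal hash-linked chain of blocks recording citizens' contributions to proposed projects, so that any later tampering with a recorded contribution is detectable. It is a toy model under obvious assumptions (no consensus protocol, no smart contracts, fabricated identifiers), not a description of any production blockchain platform.

```python
import hashlib, json, time

def sha256(payload: dict) -> str:
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

class Ledger:
    """Append-only chain of blocks, each holding citizen contributions to public projects."""
    def __init__(self):
        self.chain = [{"index": 0, "timestamp": 0, "transactions": [], "prev_hash": "0" * 64}]

    def add_block(self, transactions):
        block = {
            "index": len(self.chain),
            "timestamp": time.time(),
            "transactions": transactions,          # e.g. {"citizen": ..., "project": ..., "amount": ...}
            "prev_hash": sha256(self.chain[-1]),   # links the block to its predecessor
        }
        self.chain.append(block)

    def verify(self) -> bool:
        """Any tampering with a past contribution breaks the hash links."""
        return all(self.chain[i]["prev_hash"] == sha256(self.chain[i - 1])
                   for i in range(1, len(self.chain)))

ledger = Ledger()
ledger.add_block([{"citizen": "C-001", "project": "school-renovation", "amount": 50}])
ledger.add_block([{"citizen": "C-002", "project": "park-lighting", "amount": 20}])
print(ledger.verify())  # True; editing an earlier block would make this False
```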
Procedia PDF Downloads 149
1159 Digital Holographic Interferometric Microscopy for the Testing of Micro-Optics
Authors: Varun Kumar, Chandra Shakher
Abstract:
Micro-optical components such as microlenses and microlens arrays have numerous engineering and industrial applications: collimation of laser diodes, imaging devices for sensor systems (CCD/CMOS, document copier machines, etc.), beam homogenization for high-power lasers, critical components in Shack-Hartmann sensors, and fiber-optic coupling and optical switching in communication technology. Micro-optical components have also become an alternative for applications where miniaturization and reduction of alignment and packaging costs are necessary. Compliance with high-quality standards in the manufacturing of micro-optical components is a precondition for being competitive in worldwide markets. Therefore, high demands are put on quality assurance. For the quality assurance of these lenses, an economical measurement technique is needed. For cost and time reasons, the technique should be fast, simple (for production reasons), and robust, with high resolution. The technique should provide non-contact, non-invasive, full-field information about the shape of the micro-optical component under test. Interferometric techniques are non-contact and non-invasive and provide full-field information about the shape of optical components. Conventional interferometric techniques such as holographic interferometry or Mach-Zehnder interferometry are available for the characterization of microlenses. However, these techniques require more experimental effort and are also time-consuming. Digital holography (DH) overcomes the problems described above. Digital holographic microscopy (DHM) allows one to extract both the amplitude and phase information of a wavefront transmitted through a transparent object (microlens or microlens array) from a single recorded digital hologram by using numerical methods. One can also reconstruct the complex object wavefront at different depths thanks to numerical reconstruction. Digital holography provides axial resolution in the nanometer range, while lateral resolution is limited by diffraction and the size of the sensor. In this paper, a Mach-Zehnder-based digital holographic interferometric microscope (DHIM) system is used for the testing of transparent microlenses. The advantage of using the DHIM is that distortions due to aberrations in the optical system are avoided by the interferometric comparison of the reconstructed phase with and without the object (microlens array). In the experiment, a digital hologram is first recorded in the absence of the sample (microlens array) as a reference hologram, and a second hologram is recorded in the presence of the microlens array. The presence of the transparent microlens array induces a phase change in the transmitted laser light. The complex amplitude of the object wavefront in the presence and absence of the microlens array is reconstructed using the Fresnel reconstruction method. From the reconstructed complex amplitudes, one can evaluate the phase of the object wave in the presence and absence of the microlens array. The phase difference between the two states of the object wave provides information about the optical path length change due to the shape of the microlens. With knowledge of the refractive indices of the microlens array material and air, the surface profile of the microlens array is evaluated. The sag and radius of curvature of the microlens are evaluated and reported.
The sag of the microlens agrees well, within the experimental limit, with the value provided in the manufacturer's specification. Keywords: micro-optics, microlens array, phase map, digital holographic interferometric microscopy
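The phase-to-profile step described above reduces to a standard relation: the height at each pixel is the unwrapped phase difference times the wavelength divided by 2π times the refractive index contrast. The short sketch below applies that relation in NumPy; the wavelength, refractive index, and synthetic phase map are illustrative values, not the experimental parameters of the paper.

```python
import numpy as np

def surface_profile(phase_with, phase_without, wavelength, n_lens, n_air=1.0):
    """Convert the unwrapped phase difference between the two object states
    into a thickness/height map of the microlens.

    h(x, y) = dphi(x, y) * wavelength / (2 * pi * (n_lens - n_air))
    """
    dphi = phase_with - phase_without  # optical path change in radians
    return dphi * wavelength / (2.0 * np.pi * (n_lens - n_air))

# Illustrative numbers only: a 633 nm source and a lens material of n = 1.46
phase_without = np.zeros((256, 256))
yy, xx = np.mgrid[-128:128, -128:128]
phase_with = np.maximum(0.0, 40.0 - 0.002 * (xx**2 + yy**2))  # synthetic lens-like phase bump
height = surface_profile(phase_with, phase_without, wavelength=633e-9, n_lens=1.46)
print(f"sag of the synthetic lens: {height.max() * 1e6:.2f} um")
```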
Procedia PDF Downloads 499
1158 BiFormerDTA: Structural Embedding of Protein in Drug Target Affinity Prediction Using BiFormer
Authors: Leila Baghaarabani, Parvin Razzaghi, Mennatolla Magdy Mostafa, Ahmad Albaqsami, Al Warith Al Rushaidi, Masoud Al Rawahi
Abstract:
Predicting the interaction between drugs and their molecular targets is pivotal for advancing drug development processes. Due to time and cost limitations, computational approaches have emerged as an effective route to drug-target interaction (DTI) prediction. Most of the computational approaches introduced so far utilize the drug molecule and protein sequence as input. This study not only utilizes these inputs but also introduces a protein representation developed using a masked protein language model. In this representation, for every individual amino acid residue within the protein sequence, there exists a corresponding probability distribution that indicates the likelihood of each amino acid being present at that particular position. Then, the similarity between each pair of amino acids is computed to create a similarity matrix. To encode the knowledge of the similarity matrix, Bi-Level Routing Attention (BiFormer) is utilized, which combines aspects of transformer-based models with protein sequence analysis and represents a significant advancement in the field of drug-protein interaction prediction. BiFormer has the ability to pinpoint the regions of the protein sequence that are most responsible for facilitating interactions between the protein and drugs, thereby enhancing the understanding of these critical interactions. Thus, it appears promising in its ability to capture the local structural relationships of the protein, enhancing the understanding of how they contribute to drug-protein interactions and thereby facilitating more accurate predictions. To evaluate the proposed method, it was tested on two widely recognized datasets: Davis and KIBA. A comprehensive series of experiments was conducted to illustrate its effectiveness in comparison to cutting-edge techniques. Keywords: BiFormer, transformer, protein language processing, self-attention mechanism, binding affinity, drug target interaction, similarity matrix, protein masked representation, protein language model
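The similarity-matrix step described, comparing the per-position amino-acid probability distributions produced by the masked protein language model, could be sketched as below. The choice of cosine similarity is an assumption made here for illustration (the abstract does not state which similarity measure is used), and the toy probability profile is fabricated.

```python
import numpy as np

def residue_similarity_matrix(prob_profile):
    """Build a residue-by-residue similarity matrix from a masked-LM probability profile.

    prob_profile : (L, 20) array; row i is the predicted probability of each of the
                   20 amino acids at sequence position i (rows sum to 1).
    Returns an (L, L) matrix of cosine similarities between position profiles,
    which can then be fed to an attention module such as BiFormer.
    """
    norms = np.linalg.norm(prob_profile, axis=1, keepdims=True)
    unit = prob_profile / np.clip(norms, 1e-12, None)
    return unit @ unit.T

# Toy profile for a 5-residue sequence (random distributions, for illustration only)
rng = np.random.default_rng(0)
profile = rng.dirichlet(np.ones(20), size=5)
S = residue_similarity_matrix(profile)
print(S.shape, S.diagonal())  # (5, 5); the diagonal is exactly 1.0
```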
Procedia PDF Downloads 9
1157 Predicting the Turbulence Intensity, Excess Energy Available and Potential Power Generated by Building Mounted Wind Turbines over Four Major UK Cities
Authors: Emejeamara Francis
Abstract:
The future of potential wind energy applications within suburban/urban areas is currently faced with various problems. These include insufficient assessment of the urban wind resource and of the effectiveness of commercial gust control solutions, as well as the unavailability of effective and affordable tools for scoping the potential of urban wind applications within built-up environments. Effective assessment of the potential of urban wind installations requires an estimate of the total energy that would be available to them were effective control systems to be used, and an evaluation of the potential power to be generated by the wind system. This paper presents a methodology for predicting the power generated by a wind system operating within an urban wind resource. The method was developed by using high temporal resolution wind measurements from eight potential sites within the urban and suburban environment as inputs to a vertical axis wind turbine multiple stream tube model. A relationship between the unsteady performance coefficient obtained from the stream tube model results and turbulence intensity was demonstrated. Hence, an analytical methodology for estimating the unsteady power coefficient at a potential turbine site is proposed. This is combined with analytical models developed to predict the wind speed and the excess energy content (EEC) available, in order to estimate the potential power generated by wind systems at different heights within a built environment. Estimates of turbulence intensity, wind speed, EEC and turbine performance based on the current methodology allow a more complete assessment of the available wind resource and of potential urban wind projects. This methodology is applied to four major UK cities, namely Leeds, Manchester, London and Edinburgh, and the potential to map turbine performance at different heights within a typical urban city is demonstrated. Keywords: small-scale wind, turbine power, urban wind energy, turbulence intensity, excess energy content
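Two quantities at the heart of this methodology, turbulence intensity and the extra energy carried by gusts, can be estimated directly from a high-frequency wind-speed record, as sketched below. The definition of excess energy content used here (the surplus of the mean cubed speed over the cube of the mean speed) and the constant power coefficient are simplifying assumptions for illustration; the paper derives the unsteady performance coefficient from a multiple stream tube model instead.

```python
import numpy as np

RHO = 1.225  # air density, kg/m^3

def turbulence_intensity(u):
    """TI = standard deviation / mean of a high-frequency wind-speed record."""
    return np.std(u) / np.mean(u)

def mean_power(u, rotor_area, cp):
    """Mean power (W) using the instantaneous cubic law, so gust energy is retained.
    cp would come from the unsteady performance coefficient model in the paper;
    a constant value is used here purely for illustration."""
    return 0.5 * RHO * rotor_area * cp * np.mean(u ** 3)

# Synthetic 1 Hz record standing in for a rooftop anemometer trace
rng = np.random.default_rng(0)
u = np.clip(rng.normal(5.0, 1.5, size=3600), 0.1, None)  # m/s, one hour of data

ti = turbulence_intensity(u)
p_unsteady = mean_power(u, rotor_area=3.0, cp=0.30)
p_steady = 0.5 * RHO * 3.0 * 0.30 * np.mean(u) ** 3           # power from the mean speed only
eec = (np.mean(u ** 3) - np.mean(u) ** 3) / np.mean(u) ** 3   # assumed excess-energy definition
print(f"TI = {ti:.2f}, EEC = {eec:.2%}, unsteady {p_unsteady:.0f} W vs steady {p_steady:.0f} W")
```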
Procedia PDF Downloads 277
1156 High School Gain Analytics From National Assessment Program – Literacy and Numeracy and Australian Tertiary Admission Rank Linkage
Authors: Andrew Laming, John Hattie, Mark Wilson
Abstract:
Nine Queensland independent high schools provided deidentified student-matched ATAR and NAPLAN data for all 1217 ATAR graduates since 2020 who also sat NAPLAN at the school. Graduating cohorts from the nine schools contained a mean of 100 ATAR graduates with previous NAPLAN data from their school. Excluded were vocational students (mean=27) and any ATAR graduates without NAPLAN data (mean=20). Based on Index of Community Socio-Educational Advantage (ICSEA) prediction, all schools had larger than predicted proportions of their students graduating with ATARs. There were an additional 173 students not releasing their ATARs to their school (14%), requiring this data to be inferred by schools. Gain was established by first converting each student’s strongest NAPLAN domain to a statewide percentile, then subtracting this result from final ATAR. The resulting ‘percentile shift’ was corrected for plausible ATAR participation at each NAPLAN level. Strongest NAPLAN domain had the highest correlation with ATAR (R2=0.58). RESULTS School mean NAPLAN scores fitted ICSEA closely (R2=0.97). Schools achieved a mean cohort gain of two ATAR rankings, but only 66% of students gained. This ranged from 46% of top-NAPLAN decile students gaining, rising to 75% achieving gains outside the top decile. The 54% of top-decile students whose ATAR fell short of prediction lost a mean of 4.0 percentiles (or 6.2 percentiles prior to correction for regression to the mean). 71% of students in smaller schools gained, compared to 63% in larger schools. NAPLAN variability in each of the 13 ICSEA1100 cohorts was 17%, with both intra-school and inter-school variation of these values extremely low (0.3% to 1.8%). Mean ATAR change between years in each school was just 1.1 ATAR ranks. This suggests consecutive school cohorts and ICSEA-similar schools share very similar distributions and outcomes over time. Quantile analysis of the NAPLAN/ATAR relationship revealed heteroscedasticity, but splines offered little additional benefit over simple linear regression. The NAPLAN/ATAR R2 was 0.33. DISCUSSION Standardised data like NAPLAN and ATAR offer educators a simple no-cost progression metric to analyse performance in conjunction with their internal test results. Change is expressed in percentiles, or ATAR shift per student, which is intuitive for laypeople. Findings may also reduce ATAR/vocational stream mismatch, reveal proportions of cohorts meeting or falling short of expectation, and demonstrate by how much. Finally, ‘crashed’ ATARs well below expectation are revealed, which schools can reasonably work to minimise. The percentile shift method is neither value-add nor a growth percentile. In the absence of exit NAPLAN testing, this metric is unable to discriminate academic gain from legitimate ATAR-maximizing strategies. But by controlling for ICSEA, ATAR proportion variation and student mobility, it uncovers progression-to-ATAR metrics which are not currently publicly available. However achieved, ATAR maximisation is a sought-after private good. So long as standardised nationwide data are available, this analysis offers useful analytics for educators and reasonable predictivity when counselling subsequent cohorts about their ATAR prospects. Keywords: NAPLAN, ATAR, analytics, measurement, gain, performance, data, percentile, value-added, high school, numeracy, reading comprehension, variability, regression to the mean
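The 'percentile shift' gain metric can be illustrated with a few lines of pandas. In this sketch the statewide percentile is approximated by ranking within the supplied cohort, and the correction for plausible ATAR participation at each NAPLAN level is omitted; column names and values are fabricated, so this is a reading aid rather than the study's analysis code.

```python
import pandas as pd

def percentile_shift(df, naplan_cols, atar_col="ATAR"):
    """Convert each student's strongest NAPLAN domain to a percentile and subtract
    it from the final ATAR (itself a percentile-like rank). Positive values = gain."""
    strongest = df[naplan_cols].max(axis=1)          # strongest NAPLAN domain per student
    statewide_pct = strongest.rank(pct=True) * 100.0  # cohort-based stand-in for statewide percentile
    return df[atar_col] - statewide_pct

# Hypothetical matched records (values fabricated for illustration)
df = pd.DataFrame({
    "reading": [620, 540, 700, 580],
    "numeracy": [600, 560, 690, 610],
    "ATAR": [88.0, 72.5, 95.1, 60.0],
})
df["gain"] = percentile_shift(df, ["reading", "numeracy"])
print(df)
```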
Procedia PDF Downloads 68
1155 Eco-Products in Day-to-Day Life: A Catalyst for Achieving Sustainability
Authors: Rani Fernandez
Abstract:
As global concerns regarding environmental degradation and climate change intensify, the imperative for sustainable living has never been more critical. This research delves into the role of eco-products in everyday life as a pivotal strategy for achieving sustainability. The study investigates the awareness, adoption, and impact of eco-friendly products at individual and community levels. The research employs a mixed-methods approach, combining surveys, interviews, and case studies to explore consumer perceptions, behaviours, and motivations surrounding the use of eco-products. Additionally, life cycle assessments are conducted to evaluate the environmental footprint of selected eco-products, shedding light on their tangible contributions to sustainability. The findings reveal the diverse range of eco-products available in the market, from biodegradable packaging to energy-efficient appliances, and the extent to which consumers integrate these products into their daily routines. Moreover, the research examines the challenges and opportunities associated with widespread adoption, considering factors such as cost, accessibility, and efficacy. In addition to individual consumption patterns, the study investigates the broader societal impact of eco-product integration. It explores the potential for eco-products to drive systemic change by influencing supply chains, corporate practices, and government policies. The research highlights successful case studies of communities and businesses that have effectively incorporated eco-products, providing valuable insights into scalable models for sustainability. Ultimately, this research contributes to the discourse on sustainable living by elucidating the pivotal role of eco-products in shaping environmentally conscious behaviours. By understanding the dynamics of eco-product adoption, policymakers, businesses, and individuals can collaboratively work towards a more sustainable future. The implications of this study extend beyond academia, informing practical strategies for fostering a global shift towards sustainable consumption and production. Keywords: eco-friendly, sustainability, environment, climate change
Procedia PDF Downloads 40
1154 Good Practices for Model Structure Development and Managing Structural Uncertainty in Decision Making
Authors: Hossein Afzali
Abstract:
Increasingly, decision analytic models are used to inform decisions about whether or not to publicly fund new health technologies. It is well noted that the accuracy of model predictions is strongly influenced by the appropriateness of model structuring. However, there is relatively inadequate methodological guidance surrounding this issue in guidelines developed by national funding bodies such as the Australian Pharmaceutical Benefits Advisory Committee (PBAC) and The National Institute for Health and Care Excellence (NICE) in the UK. This presentation aims to discuss issues around model structuring within decision making with a focus on (1) the need for a transparent and evidence-based model structuring process to inform the most appropriate set of structural aspects as the base case analysis; (2) the need to characterise structural uncertainty (If there exist alternative plausible structural assumptions (or judgements), there is a need to appropriately characterise the related structural uncertainty). The presentation will provide an opportunity to share ideas and experiences on how the guidelines developed by national funding bodies address the above issues and identify areas for further improvements. First, a review and analysis of the literature and guidelines developed by PBAC and NICE will be provided. Then, it will be discussed how the issues around model structuring (including structural uncertainty) are not handled and justified in a systematic way within the decision-making process, its potential impact on the quality of public funding decisions, and how it should be presented in submissions to national funding bodies. This presentation represents a contribution to the good modelling practice within the decision-making process. Although the presentation focuses on the PBAC and NICE guidelines, the discussion can be applied more widely to many other national funding bodies that use economic evaluation to inform funding decisions but do not transparently address model structuring issues e.g. the Medical Services Advisory Committee (MSAC) in Australia or the Canadian Agency for Drugs and Technologies in Health.Keywords: decision-making process, economic evaluation, good modelling practice, structural uncertainty
Procedia PDF Downloads 186
1153 Ragging and Sludging Measurement in Membrane Bioreactors
Authors: Pompilia Buzatu, Hazim Qiblawey, Albert Odai, Jana Jamaleddin, Mustafa Nasser, Simon J. Judd
Abstract:
Membrane bioreactor (MBR) technology is challenged by the tendency for the membrane permeability to decrease due to ‘clogging’. Clogging includes ‘sludging’, the filling of the membrane channels with sludge solids, and ‘ragging’, the aggregation of short filaments to form long rag-like particles. Both sludging and ragging demand manual intervention to clear out the solids, which is time-consuming, labour-intensive and potentially damaging to the membranes. These factors impact on costs more significantly than membrane surface fouling which, unlike clogging, is largely mitigated by the chemical clean. However, practical evaluation of MBR clogging has thus far been limited. This paper presents the results of recent work attempting to quantify sludging and clogging based on simple bench-scale tests. Results from a novel ragging simulation trial indicated that rags can be formed within 24-36 hours from dispersed < 5 mm-long filaments at concentrations of 5-10 mg/L under gently agitated conditions. Rag formation occurred for both a cotton wool standard and samples taken from an operating municipal MBR, with between 15% and 75% of the added fibrous material forming a single rag. The extent of rag formation depended both on the material type or origin – lint from laundering operations forming zero rags – and the filament length. Sludging rates were quantified using a bespoke parallel-channel test cell representing the membrane channels of an immersed flat sheet MBR. Sludge samples were provided from two local MBRs, one treating municipal and the other industrial effluent. Bulk sludge properties measured comprised mixed liquor suspended solids (MLSS) concentration, capillary suction time (CST), particle size, soluble COD (sCOD) and rheology (apparent viscosity μₐ vs shear rate γ). The fouling and sludging propensity of the sludge was determined using the test cell, ‘fouling’ being quantified as the pressure incline rate against flux via the flux step test (for which clogging was absent) and sludging by photographing the channel and processing the image to determine the ratio of the clogged to unclogged regions. A substantial difference in rheological and fouling behaviour was evident between the two sludge sources, the industrial sludge having a higher viscosity but less shear-thinning than the municipal. Fouling, as manifested by the pressure increase Δp/Δt, as a function of flux from classic flux-step experiments (where no clogging was evident), was more rapid for the industrial sludge. Across all samples of both sludge origins the expected trend of increased fouling propensity with increased CST and sCOD was demonstrated, whereas no correlation was observed between clogging rate and these parameters. The relative contribution of fouling and clogging was appraised by adjusting the clogging propensity via increasing the MLSS both with and without a commensurate increase in the COD. Results indicated that whereas for the municipal sludge the fouling propensity was affected by the increased sCOD, there was no associated increased in the sludging propensity (or cake formation). The clogging rate actually decreased on increasing the MLSS. Against this, for the industrial sludge the clogging rate dramatically increased with solids concentration despite a decrease in the soluble COD. From this was surmised that sludging did not relate to fouling.Keywords: clogging, membrane bioreactors, ragging, sludge
Procedia PDF Downloads 178
1152 Parking Service Effectiveness at Commercial Malls
Authors: Ahmad AlAbdullah, Ali AlQallaf, Mahdi Hussain, Mohammed AlAttar, Salman Ashknani, Magdy Helal
Abstract:
We study the effectiveness of the parking service provided at Kuwaiti commercial malls and explore potential problems and feasible improvements. Commercial malls are important to Kuwaitis as entertainment and shopping centers due to the lack of other alternatives. The difficulty and relatively long time wasted in finding a parking spot at the mall are real annoyances. We applied queuing analysis to one of the major malls, which offers paid parking (1040 parking spots) in addition to free parking. Patrons of the mall usually complained of traffic jams and delays when entering the paid parking (the average delay to park exceeds 15 min for about 62% of patrons, while the average time spent in the mall is about 2.6 hours). However, the analysis showed acceptable service levels at the check-in gates of the parking garage. A detailed review of vehicle movement at the gateways indicated that arriving and departing cars both had to share parts of the gateway to the garage, which caused the traffic jams and delays. A simple comparison we made indicated that the largest commercial mall in Kuwait does not suffer such parking issues, while other smaller, yet important malls do, including the one we studied. It was suggested that the well-designed inlets and outlets of that gigantic mall permit smooth parking even though its parking is totally free and the mall is the first choice of most people for entertainment and shopping. A simulation model is being developed for further analysis and verification. Simulation can overcome the mathematical difficulty of using non-Poisson queuing models. The simulation model is used to explore potential changes to the parking garage entrance layout. With the inclusion of drivers' behavior inside the parking garage, effectiveness indicators can be derived to address the economic feasibility of extending the parking capacity and increasing service levels. Outcomes of the study are planned to be generalized, as appropriate, to other commercial malls in Kuwait. Keywords: commercial malls, parking service, queuing analysis, simulation modeling
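The gate analysis mentioned above typically starts from an M/M/c queue (Poisson arrivals, exponential service, c parallel gates). The sketch below computes the Erlang C waiting probability and mean queue statistics for an assumed arrival rate, service rate, and number of gates; the actual rates measured in the study are not given in the abstract, so the numbers are placeholders, and the study itself moves to simulation precisely because the real system is not well described by Poisson assumptions.

```python
from math import factorial

def mmc_metrics(arrival_rate, service_rate, servers):
    """Steady-state M/M/c queue metrics, a common starting point for gate analysis.

    arrival_rate : cars per minute arriving at the paid-parking gates
    service_rate : cars per minute a single gate can process
    servers      : number of check-in gates
    Returns (probability of waiting, mean wait in queue, mean cars queuing).
    """
    a = arrival_rate / service_rate          # offered load (Erlangs)
    rho = a / servers                        # utilisation, must be < 1 for stability
    if rho >= 1:
        raise ValueError("unstable queue: add gates or speed up service")
    p0 = 1.0 / (sum(a**k / factorial(k) for k in range(servers))
                + a**servers / (factorial(servers) * (1 - rho)))
    p_wait = a**servers / (factorial(servers) * (1 - rho)) * p0  # Erlang C probability
    wq = p_wait / (servers * service_rate - arrival_rate)        # mean wait in queue (min)
    lq = arrival_rate * wq                                       # mean cars in queue
    return p_wait, wq, lq

# Illustrative figures only (the study's measured rates are not given in the abstract)
print(mmc_metrics(arrival_rate=4.0, service_rate=1.5, servers=3))
```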
Procedia PDF Downloads 340
1151 Teachers Engagement to Teaching: Exploring Australian Teachers’ Attribute Constructs of Resilience, Adaptability, Commitment, Self/Collective Efficacy Beliefs
Authors: Lynn Sheridan, Dennis Alonzo, Hoa Nguyen, Andy Gao, Tracy Durksen
Abstract:
Disruptions to teaching (e.g., COVID-related) have increased work demands for teachers. There is an opportunity for research to explore evidence-informed steps to support teachers. Collective evidence indicates that teachers’ personal attributes (e.g., self-efficacy beliefs) in the workplace promote success in teaching and support teacher engagement. Teacher engagement plays a role in students’ learning and teachers’ effectiveness. Engaged teachers are better at overcoming work-related stress and burnout and are more likely to take on active roles. Teachers’ commitment is influenced by a host of personal (e.g., teacher well-being) and environmental factors (e.g., job stresses). The job demands-resources model provided a conceptual basis for examining how teachers’ well-being is influenced by job demands and job resources. Job demands potentially evoke strain and exceed the employee’s capability to adapt. Job resources entail what the job offers to individual teachers (e.g., organisational support), helping to reduce job demands. Applying the job demands-resources model involves gathering an evidence base on personal attributes (job resources) and their connections. The study explored the association between constructs (resilience, adaptability, commitment, self/collective efficacy) and a teacher’s engagement with the job. The paper sought to elaborate on the model and determine the associations between key constructs of well-being (resilience, adaptability), commitment, and motivation (self- and collective-efficacy beliefs) and teachers’ engagement in teaching. Data collection involved an online multi-dimensional instrument using validated items, distributed from 2020 to 2022. The instrument was designed to identify construct relationships. There were 170 participants. Data analysis: the reliability coefficients, means, standard deviations, skewness, and kurtosis statistics for the six variables were computed. All scales have good reliability coefficients (.72-.96). A confirmatory factor analysis (CFA) and a structural equation model (SEM) were performed to provide measurement support and to obtain latent correlations among factors. The final analysis was performed using structural equation modelling. Several fit indices were used to evaluate model fit, including chi-square statistics and the root mean square error of approximation. The correlations of constructs were all positive, with the highest found between teacher engagement and resilience (r=.80) and the lowest between teacher adaptability and collective teacher efficacy (r=.22). Given these associations, we proceeded with CFA. The CFA yielded adequate fit: χ²(270, 1019) = 1836.79, p < .001, RMSEA = .04, CFI = .94, TLI = .93, and SRMR = .04. All values were within the threshold values, indicating a good model fit. Results indicate that increasing teacher self-efficacy beliefs will increase a teacher’s level of engagement, and that teacher adaptability and resilience are positively associated with self-efficacy beliefs, as are collective teacher efficacy beliefs. Implications for school leaders and school systems are: 1. investing in teachers’ sense of efficacy to help them manage work demands; 2. leadership approaches that enhance teachers' adaptability and resilience; and 3. a culture of collective efficacy support.
Preparing teachers for now and in the future offers an important reminder to policymakers and school leaders on the importance of supporting teachers’ personal attributes when faced with the challenging demands of the job.Keywords: collective teacher efficacy, teacher self-efficacy, job demands, teacher engagement
Procedia PDF Downloads 127
1150 The Effects of Cardiovascular Risk on Age-Related Cognitive Decline in Healthy Older Adults
Authors: A. Badran, M. Hollocks, H. Markus
Abstract:
Background: Common risk factors for cardiovascular disease are associated with age-related cognitive decline. There has been much interest in treating modifiable cardiovascular risk factors in the hope of reducing cognitive decline. However, there is currently no validated neuropsychological test to assess the subclinical cognitive effects of vascular risk. The Brief Memory and Executive Test (BMET) is a clinical screening tool, which was originally designed to be sensitive and specific to Vascular Cognitive Impairment (VCI), an impairment characterised by decline in frontally-mediated cognitive functions (e.g. Executive Function and Processing Speed). Objective: To cross-sectionally assess the validity of the BMET as a measure of the subclinical effects of vascular risk on cognition, in an otherwise healthy elderly cohort. Methods: Data from 346 participants (57 ± 10 years) without major neurological or psychiatric disorders were included in this study, gathered as part of a previous multicentre validation study for the BMET. Framingham Vascular Age was used as a surrogate measure of vascular risk, incorporating several established risk factors. Principal Components Analysis of the subtests was used to produce common constructs: an index for Memory and another for Executive Function/Processing Speed. Univariate General Linear models were used to relate Vascular Age to performance on Executive Function/Processing Speed and Memory subtests of the BMET, adjusting for Age, Premorbid Intelligence and Ethnicity. Results: Adverse vascular risk was associated with poorer performance on both the Memory and Executive Function/Processing Speed indices, adjusted for Age, Premorbid Intelligence and Ethnicity (p=0.011 and p<0.001, respectively). Conclusions: Performance on the BMET reflects the subclinical effects of vascular risk on cognition, in age-related cognitive decline. Vascular risk is associated with decline in both Executive Function/Processing Speed and Memory groups of subtests. Future studies are needed to explore whether treating vascular risk factors can effectively reduce age-related cognitive decline.Keywords: age-related cognitive decline, vascular cognitive impairment, subclinical cerebrovascular disease, cognitive aging
Procedia PDF Downloads 471
1149 [Keynote Talk]: Knowledge Codification and Innovation Success within Digital Platforms
Authors: Wissal Ben Arfi, Lubica Hikkerova, Jean-Michel Sahut
Abstract:
This study examines interfirm networks in the digital transformation era, and in particular, how tacit knowledge codification affects innovation success within digital platforms. Hence, one of the most important features of digital transformation and innovation process outcomes is the emergence of digital platforms, as an interfirm network, at the heart of open innovation. This research aims to illuminate how digital platforms influence inter-organizational innovation through virtual team interactions and knowledge sharing practices within an interfirm network. Consequently, it contributes to the respective strategic management literature on new product development (NPD), open innovation, industrial management, and its emerging interfirm networks’ management. The empirical findings show, on the one hand, that knowledge conversion may be enhanced, especially by the socialization which seems to be the most important phase as it has played a crucial role to hold the virtual team members together. On the other hand, in the process of socialization, the tacit knowledge codification is crucial because it provides the structure needed for the interfirm network actors to interact and act to reach common goals which favor the emergence of open innovation. Finally, our results offer several conditions necessary, but not always sufficient, for interfirm managers involved in NPD and innovation concerning strategies to increasingly shape interconnected and borderless markets and business collaborations. In the digital transformation era, the need for adaptive and innovative business models as well as new and flexible network forms is becoming more significant than ever. Supported by technological advancements and digital platforms, companies could benefit from increased market opportunities and creating new markets for their innovations through alliances and collaborative strategies, as a mode of reducing or eliminating uncertainty environments or entry barriers. Consequently, an efficient and well-structured interfirm network is essential to create network capabilities, to ensure tacit knowledge sharing, to enhance organizational learning and to foster open innovation success within digital platforms.Keywords: interfirm networks, digital platform, virtual teams, open innovation, knowledge sharing
Procedia PDF Downloads 130
1148 An Approach to Autonomous Drones Using Deep Reinforcement Learning and Object Detection
Authors: K. R. Roopesh Bharatwaj, Avinash Maharana, Favour Tobi Aborisade, Roger Young
Abstract:
Presently, there are few cases of complete automation of drones and their allied intelligence capabilities. In essence, the potential of the drone has not yet been fully utilized. This paper presents feasible methods to build an intelligent drone with smart capabilities such as self-driving and obstacle avoidance. It does this through advanced reinforcement learning techniques and performs object detection using the latest algorithms, which are capable of processing lightweight models with fast training in real-time settings. For the scope of this paper, after researching the various algorithms and comparing them, we implemented the Deep Q-Network (DQN) algorithm in the AirSim simulator. In future work, we plan to implement more advanced self-driving and object detection algorithms; we also plan to implement voice-based speech recognition for the entire drone operation, which would provide an option of speech communication between users (people) and the drone in unavoidable circumstances. Thus, drones would become interactive, intelligent, voice-enabled robotic service assistants. The proposed drone has a wide scope of usability and is applicable in scenarios such as disaster management, air transport of essentials, agriculture, manufacturing, monitoring people's movements in public areas, and defense. Also discussed is drone communication based on satellite broadband Internet technology for faster computation and seamless communication during disasters and remote-location operations. This paper explains the feasible algorithms required to achieve this goal and is intended as a reference paper for future researchers going down this path. Keywords: convolutional neural network, natural language processing, obstacle avoidance, satellite broadband technology, self-driving
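A minimal sketch of the DQN ingredients named above (a Q-network, an epsilon-greedy policy, and a temporal-difference update on a replay batch) is given below in PyTorch. The state dimension, the four discrete actions, the network sizes, and the fabricated batch are placeholders, and the AirSim environment loop, replay buffer, and reward shaping for obstacle avoidance are deliberately omitted.

```python
import random
import torch
import torch.nn as nn

class QNet(nn.Module):
    """Small fully connected Q-network: state -> Q-value per discrete action."""
    def __init__(self, state_dim=8, n_actions=4):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(state_dim, 64), nn.ReLU(),
                                 nn.Linear(64, 64), nn.ReLU(),
                                 nn.Linear(64, n_actions))
    def forward(self, x):
        return self.net(x)

q_net, target_net = QNet(), QNet()
target_net.load_state_dict(q_net.state_dict())
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)
gamma, epsilon = 0.99, 0.1

def select_action(state):
    """Epsilon-greedy policy over the 4 hypothetical drone actions."""
    if random.random() < epsilon:
        return random.randrange(4)
    with torch.no_grad():
        return int(q_net(state.unsqueeze(0)).argmax(dim=1).item())

def dqn_update(batch):
    """One temporal-difference update on a replay-buffer batch."""
    states, actions, rewards, next_states, dones = batch
    q_sa = q_net(states).gather(1, actions.unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        target = rewards + gamma * target_net(next_states).max(dim=1).values * (1 - dones)
    loss = nn.functional.mse_loss(q_sa, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Fabricated mini-batch standing in for AirSim transitions (state = 8 sensor features)
b = (torch.randn(32, 8), torch.randint(0, 4, (32,)), torch.rand(32), torch.randn(32, 8), torch.zeros(32))
print(dqn_update(b), select_action(torch.randn(8)))
```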
Procedia PDF Downloads 251
1147 Rapid Formation of Ortho-Boronoimines and Derivatives for Reversible and Dynamic Bioconjugation Under Physiological Conditions
Authors: Nicholas C. Rose, Christopher D. Spicer
Abstract:
The regeneration of damaged or diseased tissues would provide an invaluable therapeutic tool in biological research and medicine. Cells must be provided with a number of different biochemical signals in order to form mature tissue through complex signaling networks that are difficult to recreate in synthetic materials. The ability to attach and detach bioactive proteins from material in an iterative and dynamic manner would therefore present a powerful way to mimic natural biochemical signaling cascades for tissue growth. We propose to reversibly attach these bioactive proteins using ortho-boronoimine (oBI) linkages and related derivatives formed by the reaction of an ortho-boronobenzaldehyde with a nucleophilic amine derivative. To enable the use of oBIs for biomaterial modification, we have studied binding and cleavage processes with precise detail in the context of small molecule models. A panel of oBI complexes has been synthesized and screened using a novel Förster resonance energy transfer (FRET) assay, using a cyanine dye FRET pair (Cy3 and Cy5), to identify the most reactive boron-aldehyde/amine nucleophile pairs. Upon conjugation of the dyes, FRET occurs under Cy3 excitation and the resultant ratio of Cy3:Cy5 emission directly correlates to conversion. Reaction kinetics and equilibria can be accurately quantified for reactive pairs, with dissociation constants of oBI derivatives in water (KD) found to span 9-orders of magnitude (10⁻²-10⁻¹¹ M). These studies have provided us with a better understanding of oBI linkages that we hope to exploit to reversibly attach bioconjugates to materials. The long-term aim of the project is to develop a modular biomaterial platform that can be used to help combat chronic diseases such as osteoarthritis, heart disease, and chronic wounds by providing cells with potent biological stimuli for tissue engineering.Keywords: dynamic, bioconjugation, bornoimine, rapid, physiological
Procedia PDF Downloads 96
1146 Estimation of Physico-Mechanical Properties of Tuffs (Turkey) from Indirect Methods
Authors: Mustafa Gok, Sair Kahraman, Mustafa Fener
Abstract:
In rock engineering applications, determining the uniaxial compressive strength (UCS), Brazilian tensile strength (BTS), and basic index properties such as density, porosity, and water absorption is crucial for the design of both underground and surface structures. However, obtaining reliable samples for direct testing, especially from rocks that weather quickly and have low strength, is often challenging. In such cases, indirect methods provide a practical alternative for estimating the physical and mechanical properties of these rocks. In this study, tuff samples collected from the Cappadocia region (Nevşehir) in Turkey were subjected to indirect testing methods. Over 100 tests were conducted using the needle penetrometer index (NPI), point load strength index (PLI), and disc shear index (BPI) to estimate the uniaxial compressive strength (UCS), Brazilian tensile strength (BTS), density, and water absorption of the tuffs. The relationships between the results of these indirect tests and the target physical properties were evaluated using simple and multiple regression analyses. The findings reveal strong correlations between the indirect methods and the mechanical properties of the tuffs: both uniaxial compressive strength and Brazilian tensile strength could be accurately predicted from the NPI, PLI, and BPI values. The regression models developed in this study allow rapid, cost-effective assessments of tuff strength in cases where direct testing is impractical. These results are particularly valuable for geological engineering applications with time and resource constraints. This study highlights the significance of indirect methods as reliable predictors of the mechanical behavior of weak rocks such as tuffs. Further research is recommended to explore the application of these methods to other rock types with similar characteristics and to compare the results with those of established direct test methods.
Keywords: brazilian tensile strength, disc shear strength, indirect methods, tuffs, uniaxial compressive strength
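As an illustration of the simple and multiple regression analyses described, the sketch below fits linear prediction equations for UCS from the index tests. The specimen values, units, and the choice of scikit-learn are placeholders and assumptions, not the study's dataset or software.

```python
# Illustrative regression workflow for estimating UCS from index tests (NPI, PLI, BPI).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Hypothetical index-test results for a handful of tuff specimens.
X = np.array([
    [12.1, 1.8, 4.2],   # [NPI, PLI (MPa), BPI (MPa)] per specimen
    [15.4, 2.3, 5.0],
    [9.7,  1.2, 3.1],
    [18.0, 2.9, 6.3],
    [13.5, 2.0, 4.6],
])
ucs = np.array([22.5, 29.1, 16.8, 35.4, 25.0])  # measured UCS (MPa), hypothetical

# Simple regression: UCS predicted from PLI alone.
simple = LinearRegression().fit(X[:, [1]], ucs)
print("UCS ≈ %.2f * PLI + %.2f" % (simple.coef_[0], simple.intercept_))

# Multiple regression: UCS predicted from NPI, PLI, and BPI together.
multiple = LinearRegression().fit(X, ucs)
print("R² (multiple model):", r2_score(ucs, multiple.predict(X)))
```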
Procedia PDF Downloads 171145 Comparative Parametric Analysis on the Dynamic Response of Fibre Composite Beams with Debonding
Authors: Indunil Jayatilake, Warna Karunasena
Abstract:
Fiber Reinforced Polymer (FRP) composites enjoy an array of applications ranging from aerospace, marine, and military to the automobile, recreational, and civil industries due to their outstanding properties. A structural glass fiber reinforced polymer (GFRP) composite sandwich panel made from E-glass fiber skins and a modified phenolic core has been manufactured in Australia for civil engineering applications. One of the major damage mechanisms in FRP composites is skin-core debonding. The presence of debonding is of great concern not only because it severely affects the strength but also because it modifies the dynamic characteristics of the structure, including the natural frequencies and vibration modes. This paper investigates the dynamic characteristics of a GFRP beam with single and multiple debonding through finite element-based numerical simulations and analyses using the STRAND7 finite element (FE) software package. Three-dimensional computer models were developed and numerical simulations were performed to assess the dynamic behavior. The FE model has been validated against published experimental, analytical, and numerical results for fully bonded as well as debonded beams. A comparative analysis is carried out based on a comprehensive parametric investigation. It is observed that the reduction in natural frequency is affected more by a single debonding than by equally sized multiple debonding regions located symmetrically about the single debonding position. This reveals that a large single debonded area causes more damage, in terms of natural frequency reduction, than isolated small debonded zones of equivalent total area appearing in the GFRP beam. Furthermore, the extent of the natural frequency shift is mode-dependent and does not show a monotonic trend of increasing with mode number.
Keywords: debonding, dynamic response, finite element modelling, novel FRP beams
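The natural frequencies discussed above come from a standard finite element modal analysis. The sketch below shows that generic eigenvalue step under illustrative assumptions (a toy two-degree-of-freedom system and SciPy); it is not the STRAND7 beam model used in the paper.

```python
# Generic modal-analysis step behind the reported natural frequencies: solve the
# generalized eigenvalue problem K·φ = ω²·M·φ for the FE stiffness (K) and mass (M)
# matrices. The 2-DOF matrices below are toy placeholders.
import numpy as np
from scipy.linalg import eigh

K = np.array([[4.0e6, -2.0e6],
              [-2.0e6, 2.0e6]])   # stiffness matrix (N/m), illustrative
M = np.array([[10.0, 0.0],
              [0.0,  5.0]])       # lumped mass matrix (kg), illustrative

eigvals, modes = eigh(K, M)              # eigenvalues are ω², columns of `modes` are mode shapes
freqs_hz = np.sqrt(eigvals) / (2 * np.pi)
print("Natural frequencies (Hz):", freqs_hz)
# Debonding reduces local stiffness contributions in K, which lowers these
# frequencies -- the shift the paper uses as a damage indicator.
```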
Procedia PDF Downloads 117
1144 Application of WHO's Guideline to Evaluating Apps for Smoking Cessation
Authors: Suin Seo, Sung-Il Cho
Abstract:
Background: The use of mobile apps for smoking cessation has grown exponentially in recent years. Yet, to our knowledge, little research has evaluated the quality of smoking cessation apps. In most cases, a clinical practice guideline aimed at clinical physicians has been used as the evaluation tool. Objective: The objective of this study was to develop a user-centered measure of the quality of mobile smoking cessation apps. Methods: A literature search was conducted to identify articles containing explicit smoking cessation guidance for smokers published up to January 2018. The WHO's guide for tobacco users to quit was adopted as the evaluation tool assessing the smoker-oriented content of smoking cessation apps. Compared with a clinical practice guideline, the WHO guideline was designed for smokers (non-specialists). On the basis of existing criteria developed from the 2008 clinical practice guideline for Treating Tobacco Use and Dependence, the evaluation tool was modified and developed by an expert panel. Results: Five broad categories of criteria were identified as objective quality scales: enhancing motivation, assistance with planning and making quit attempts, preparation for relapse, self-efficacy, and connection to smoking. Enhancing motivation and assistance with planning and making quit attempts were similar to the contents of the clinical practice guideline, but preparation for relapse, self-efficacy, and connection to smoking (the environment or habits that remind the user of smoking) existed only in the WHO guideline. The WHO guideline had more user-centered elements than the clinical guideline. In particular, self-efficacy is the most important determinant of behavior change according to many health behavior change models. With the WHO guideline, it is now possible to analyze the content of an app from the perspective of the health participant, not the provider. Conclusion: The WHO guideline evaluation tool is a simple, reliable, smoker-centered tool for assessing the quality of mobile smoking cessation apps. It can also be used as a checklist for the development of new high-quality smoking cessation apps.
Keywords: smoking cessation, evaluation, mobile application, WHO, guideline
Procedia PDF Downloads 188
1143 Performance of AquaCrop Model for Simulating Maize Growth and Yield Under Varying Sowing Dates in Shire Area, North Ethiopia
Authors: Teklay Tesfay, Gebreyesus Brhane Tesfahunegn, Abadi Berhane, Selemawit Girmay
Abstract:
Adjusting the sowing date of a crop at a particular location under a changing climate is an essential management option for maximizing crop yield. However, determining the optimum sowing date for rainfed maize production through field experimentation requires repeated trials over many years under different weather conditions and crop management. To avoid such long-term experimentation, crop models such as AquaCrop are useful. Therefore, the overall objective of this study was to evaluate the performance of the AquaCrop model in simulating maize productivity under varying sowing dates. A field experiment was conducted for two consecutive cropping seasons with four maize sowing dates in a randomized complete block design with three replications. The input data required to run this model are stored as climate, crop, soil, and management files in the AquaCrop database and adjusted through the user interface. Observed data from separate field experiments were used to calibrate and validate the model. The AquaCrop model was validated for its performance in simulating the green canopy cover and aboveground biomass of maize for the varying sowing dates based on the calibrated parameters. The results showed good agreement between measured and simulated values of canopy cover and biomass yields (overall R² = , Ef = , d = , RMSE = ). Considering the overall values of the statistical test indicators, the model predicted maize growth and biomass yield successfully, making it a valuable decision-support tool. Hence, this calibrated and validated model is suggested for determining the optimum maize sowing date under climate and soil conditions similar to those of the study area, instead of conducting long-term experimentation.
Keywords: AquaCrop model, calibration, validation, simulation
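The statistical indicators listed in the parenthesis (R², Ef, d, RMSE), whose numerical values are not reproduced in this abstract, are conventionally defined as follows for n pairs of observed (Oᵢ) and simulated (Sᵢ) values; these are the standard formulas, with Ef typically denoting the Nash–Sutcliffe model efficiency and d Willmott's index of agreement:

\[
\mathrm{RMSE}=\sqrt{\tfrac{1}{n}\sum_{i=1}^{n}(S_i-O_i)^2},\qquad
E_f=1-\frac{\sum_{i=1}^{n}(O_i-S_i)^2}{\sum_{i=1}^{n}(O_i-\bar{O})^2},\qquad
d=1-\frac{\sum_{i=1}^{n}(S_i-O_i)^2}{\sum_{i=1}^{n}\bigl(\lvert S_i-\bar{O}\rvert+\lvert O_i-\bar{O}\rvert\bigr)^2},
\]

where \(\bar{O}\) is the mean of the observed values. Values of Ef and d close to 1 and a small RMSE indicate close agreement between simulated and measured data.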
Procedia PDF Downloads 71
1142 A Multi-Criteria Decision Method for the Recruitment of Academic Personnel Based on the Analytical Hierarchy Process and the Delphi Method in a Neutrosophic Environment
Authors: Antonios Paraskevas, Michael Madas
Abstract:
For a university to maintain its international competitiveness in education, it is essential to recruit high-quality academic staff, as they constitute its most valuable asset. This selection plays a significant role in achieving strategic objectives, particularly by emphasizing a firm commitment to an exceptional student experience and to innovative, high-quality teaching and learning practices. In this vein, the appropriate selection of academic staff is a very important factor in the competitiveness, efficiency, and reputation of an academic institute. Within this framework, our work presents a comprehensive methodological concept that emphasizes the multi-criteria nature of the problem and shows how decision-makers can use our approach to reach an appropriate judgment. The conceptual framework introduced in this paper is built upon a hybrid neutrosophic method based on the Neutrosophic Analytical Hierarchy Process (N-AHP), which uses the theory of neutrosophic sets and is considered suitable given the significant degree of ambiguity and indeterminacy observed in the decision-making process. To this end, our framework extends the N-AHP by incorporating the Neutrosophic Delphi Method (N-DM). By applying the N-DM, we can take into consideration the importance of each decision-maker and their preferences for each evaluation criterion. To the best of our knowledge, the proposed model is the first to apply the Neutrosophic Delphi Method to the selection of academic staff. As a case study, we applied our method to a real problem of academic personnel selection, with the main goal of enhancing the algorithm proposed in previous work and thus addressing the inherent ineffectiveness that traditional multi-criteria decision-making methods exhibit in such situations. As a further result, we show that our method demonstrates greater applicability and reliability when compared with other decision models.
Keywords: multi-criteria decision making methods, analytical hierarchy process, delphi method, personnel recruitment, neutrosophic set theory
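To make the AHP core of the method concrete, the snippet below derives criterion weights from a crisp pairwise comparison matrix and checks judgment consistency. It illustrates only the classical step that N-AHP generalizes with neutrosophic judgments; the three criteria and the numbers are hypothetical, not the paper's case study.

```python
# Crisp AHP core: priority weights via the geometric-mean method plus a consistency check.
import numpy as np

# Pairwise comparisons on Saaty's 1-9 scale for three hypothetical criteria,
# e.g. research record vs. teaching quality vs. service.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priority vector: normalized geometric mean of each row.
w = np.prod(A, axis=1) ** (1 / A.shape[0])
w /= w.sum()

# Consistency check against Saaty's random index (RI = 0.58 for n = 3).
n = A.shape[0]
lam_max = np.mean((A @ w) / w)
ci = (lam_max - n) / (n - 1)
cr = ci / 0.58
print("weights:", np.round(w, 3), "| consistency ratio:", round(cr, 3))  # CR < 0.10 is acceptable
```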
Procedia PDF Downloads 117
1141 An Empirical Study of the Moderation Effects of Commitment, Trust, and Relationship Value in the Relation of Goods and Services Related to Business to Business Brand Images on Customer Loyalty
Authors: Jorge Luis Morales Romero, Enrique Murillo Othón
Abstract:
Business-to-business (B2B) relationships generally go beyond a purely profit-based result, with firms seeking to maintain a relationship for many years because a breakup or switching to a new supplier can be very costly. Therefore, identifying the factors that determine a successful long-term relationship is of great interest to companies. A firm's reputation and the brand image that customers hold of it are among the main factors that can sustain a successful relationship, because of the positive effect they have on customer loyalty. Additionally, the perception that a customer has of a brand differs depending on whether it relates to goods or to services: customers form their own brand image in their minds based on their past experiences. Thus, a positive relationship is established between goods-related brand image, service-related brand image, and customer loyalty. The present investigation examines the boundary conditions of this relationship by testing the moderating effects of trust, commitment, and relationship value in a B2B environment. Each variable was tested independently as a moderator for the service-related brand image/loyalty relation and for the goods-related brand image/loyalty relation, as these are assumed to be separate constructs. Survey data were collected through interviews with customers that have both a product-buying relationship and a service relationship with a global B2B brand of healthcare equipment operating in the Mexican healthcare market. Respondents were the user, the purchasing manager, and/or the person responsible for equipment maintenance in the customer organization; hence, they were appropriate informants regarding the B2B relationship with this healthcare brand. The moderation models were estimated using the PROCESS macro for the Statistical Package for the Social Sciences (SPSS). Results show statistical evidence that both relationship value and trust are significant moderators of the service-related brand image/loyalty relation but not of the goods-related brand image/loyalty relation. Conversely, commitment is a significant moderator of the goods-related brand image/loyalty relation but not of the service-related brand image/loyalty relation.
Keywords: commitment, trust, relationship value, loyalty, B2B, moderator
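The moderation analyses reported were run with the PROCESS macro in SPSS. The sketch below shows the equivalent interaction-term regression (in the style of PROCESS Model 1) in Python on simulated data, purely to illustrate what testing a moderator means here; the variable names and values are hypothetical, not the study's survey data or output.

```python
# Moderation model: loyalty = b0 + b1·brand_image + b2·moderator + b3·(brand_image × moderator).
# A significant interaction coefficient b3 indicates moderation.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "service_brand_image": rng.normal(5, 1, n),
    "trust": rng.normal(4, 1, n),
})
# Simulate loyalty with a genuine interaction effect so the example is non-trivial.
df["loyalty"] = (0.5 * df["service_brand_image"] + 0.3 * df["trust"]
                 + 0.2 * df["service_brand_image"] * df["trust"]
                 + rng.normal(0, 1, n))

# The '*' in the formula expands to both main effects plus their interaction term.
model = smf.ols("loyalty ~ service_brand_image * trust", data=df).fit()
print(model.summary().tables[1])  # the interaction row is the moderation test
```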
Procedia PDF Downloads 93
1140 Contributions of Women to the Development of Hausa Literature as an Effective Means of Public Enlightenment: The Case of a 19th Century Female Scholar Maryam Bint Uthman Ibn Foduye
Authors: Balbasatu Ibrahim
Abstract:
In the 19th century, an Islamic revolution known as the Sokoto Jihad took place in Hausaland, leading to the establishment of the Sokoto Caliphate in 1804 under the leadership of the famous Sheikh Uthman Bn Fodiye. Before the Jihad movement in Hausaland (now northern Nigeria), women were left in ignorance and were used and discarded like old kitchen utensils. The Sheikh and his followers did their best to actualize women's right to education by using their female family members, who were highly educated and renowned scholars, as role models. After the Jihad, with the establishment of an Islamic state, these women scholars initiated different strategies to teach the generality of women. The most efficient strategy was the 'Yantaru movement, founded by Nana Asma'u, the daughter of Sheikh Uthman Bn Fodiye, in collaboration with her sisters around 1840. The 'Yantaru movement is a women's educational movement aimed at enlightening women in rural and urban areas, and it helped mobilize women for education on a massive scale. In addition to town pupils, women from villages and from the nooks and crannies of metropolitan Sokoto participated in the movement in the search for knowledge; thus the birth of the 'Yantaru system of women's education. The 'Yantaru operate a three-tier system at the village, town, and metropolitan levels of Sokoto. 'Yantaru functions include imparting knowledge to elderly women and young girls and running step-down enlightenment programs on returning home. The most effective medium of communication in the 'Yantaru movement was poetry: scholars composed educational poems that were memorized by the 'Yantaru, who on returning home recited them to fellow women. Through this system, many women were educated. This paper translates and examines one such educative poem, written in 1855 by the second leader of the 'Yantaru movement, Maryam Bint Uthman Bn Fodiye.
Keywords: English, Hausa language, public enlightenment, Maryam Bint Uthman Ibn Foduye
Procedia PDF Downloads 366
1139 Early Prediction of Diseases in a Cow for Cattle Industry
Authors: Ghufran Ahmed, Muhammad Osama Siddiqui, Shahbaz Siddiqui, Rauf Ahmad Shams Malick, Faisal Khan, Mubashir Khan
Abstract:
In this paper, a machine learning-based approach for the early prediction of diseases in cows is proposed. Different machine learning algorithms are applied to extract useful patterns from the available dataset. Technology has changed today's world in every aspect of life, and advanced technologies have likewise been developed in livestock and dairy farming to monitor dairy cows in various respects. Dairy cattle monitoring is crucial, as it plays a significant role in milk production around the globe. Moreover, it has become necessary for farmers to adopt the latest early-prediction technologies as food demand increases with population growth, which highlights the importance of state-of-the-art technologies in analyzing dairy cows' activities. It is not easy to predict the activities of a large number of cows on a farm, so the proposed system makes this very convenient for farmers by providing all the solutions under one roof. The productivity of the cattle industry is boosted because any disease on a cattle farm is diagnosed early and therefore treated early, based on the output of the machine learning models. The learning models interpret the data collected in a centralized system: different algorithms are run on the dataset received to analyze milk quality and to track cows' health, location, and safety. The learning algorithm draws patterns from the data, which makes it easier for farmers to study any animal's behavioral changes. With the emergence of machine learning algorithms and the Internet of Things, accurate tracking of animals is possible and the rate of error is minimized. As a result, milk productivity is increased. IoT with ML capability has given a new phase to the cattle farming industry by increasing yield in a cost-effective and time-saving manner.
Keywords: IoT, machine learning, health care, dairy cows
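As a sketch of the kind of early-warning model the abstract describes, the example below trains a classifier on per-cow sensor features to flag likely illness. The features, the threshold-based labels, and the choice of a random forest are illustrative assumptions, not the study's data or final algorithm.

```python
# Illustrative early-disease classifier from daily sensor features (hypothetical data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)
n = 500
# Features per cow-day: activity (steps), rumination (min), body temperature (°C), milk yield (L).
X = np.column_stack([
    rng.normal(3000, 600, n),
    rng.normal(480, 80, n),
    rng.normal(38.6, 0.4, n),
    rng.normal(28, 5, n),
])
# Hypothetical labels: low rumination combined with elevated temperature marks a sick cow.
y = ((X[:, 1] < 420) & (X[:, 2] > 38.9)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))  # precision/recall for the "sick" class
```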
Procedia PDF Downloads 71