Search results for: five factor model of personality
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20769

5769 Annotation Ontology for Semantic Web Development

Authors: Hadeel Al Obaidy, Amani Al Heela

Abstract:

The main purpose of this paper is to examine the concept of the semantic web and the role that ontology and semantic annotation play in the development of semantic web services. The paper focuses on semantic web infrastructure, illustrating how ontology and annotation work together to build content semantically. To improve software productivity and quality, the paper applies approaches, notations and techniques offered by software engineering. It proposes a conceptual model for developing semantic web services for the web information retrieval infrastructure of digital libraries. The developed system uses ontology and annotation to build a knowledge-based system that defines and links the meaning of web content in order to retrieve information for users’ queries. Retrieved results become more relevant through keyword and ontology-rule expansion, which satisfies the requested information more accurately. Result accuracy is further enhanced because each query is semantically analyzed against the conceptual architecture of the proposed system.
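
The keyword and ontology-rule expansion described above can be sketched as a graph walk over a small concept map. The toy ontology, its terms, and the expansion depth below are all illustrative assumptions, not the paper's actual knowledge base:

```python
# Toy sketch of ontology-driven query expansion: a query keyword is
# enlarged with related terms drawn from a small hand-made ontology.
TOY_ONTOLOGY = {
    # concept -> directly related concepts (synonyms / narrower terms)
    "semantic web": ["ontology", "annotation", "linked data"],
    "ontology": ["class hierarchy", "knowledge representation"],
    "library": ["digital library", "catalogue"],
}

def expand_query(keywords, ontology, depth=1):
    """Return the keyword set enlarged with ontology neighbours."""
    expanded = set(keywords)
    frontier = set(keywords)
    for _ in range(depth):
        # follow one layer of ontology links from the current frontier
        frontier = {t for k in frontier for t in ontology.get(k, [])}
        expanded |= frontier
    return expanded

result = expand_query(["semantic web"], TOY_ONTOLOGY, depth=2)
```

The expanded term set would then be matched against annotated content instead of the raw query alone.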

Keywords: semantic web services, software engineering, semantic library, knowledge representation, ontology

Procedia PDF Downloads 161
5768 Iterative Design Process for Development and Virtual Commissioning of Plant Control Software

Authors: Thorsten Prante, Robert Schöch, Ruth Fleisch, Vaheh Khachatouri, Alexander Walch

Abstract:

The development of industrial plant control software is a complex and often very expensive task. One of the core problems is that much of the implementation and adaptation work can only be done after the plant hardware has been installed. In this paper, we present our approach to virtually developing and validating the plant-level control software of production plants. This way, plant control software can be virtually commissioned before the actual ramp-up of a plant, reducing commissioning costs and time. Technically, this is achieved by linking the actual plant-wide process control software (often called the plant server) and an elaborate virtual plant model together to form an emulation system. Method-wise, we suggest a four-step iterative process with well-defined increments and time frames. Our work is based on practical experience from the planning, commissioning, and start-up of several cut-to-size plants.

Keywords: iterative system design, virtual plant engineering, plant control software, simulation and emulation, virtual commissioning

Procedia PDF Downloads 472
5767 Evaluation of Diagnosis Performance Based on Pairwise Model Construction and Filtered Data

Authors: Hyun-Woo Cho

Abstract:

It is quite important to utilize timely and intelligent production monitoring and diagnosis of industrial processes with respect to quality and safety issues. Compared with the monitoring task, fault diagnosis is the task of finding the process variables responsible for causing a specific fault in the process. It can help process operators investigate and eliminate root causes more effectively and efficiently. This work focused on combining a nonlinear statistical technique with a preprocessing method in order to implement practical real-time fault identification schemes for data-rich cases. To compare its performance with existing identification schemes, a case study on a benchmark process was performed under several scenarios. The results showed that the proposed fault identification scheme produced more reliable diagnosis results than linear methods. In addition, the filtering step improved the identification results for complicated processes with massive data sets.
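
As a point of reference for the linear methods the abstract compares against, a contribution-based fault identification step can be sketched with a T² (Mahalanobis) contribution plot: each variable's share of the fault statistic points at the responsible variable. The data, variable count, and injected fault below are synthetic assumptions; the paper's own nonlinear technique and filtering step are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)
normal = rng.normal(size=(500, 4))              # normal operating data
mean, std = normal.mean(axis=0), normal.std(axis=0)
Xs = (normal - mean) / std                      # autoscaled training data
Sinv = np.linalg.inv(np.cov(Xs, rowvar=False))  # inverse covariance

def contributions(x):
    """Per-variable contribution to the T^2 statistic for one sample."""
    z = (x - mean) / std
    return z * (Sinv @ z)                       # entries sum to z' Sinv z

fault = rng.normal(size=4)
fault[2] += 8.0                                 # inject a bias in variable 2
contrib = contributions(fault)
suspect = int(np.argmax(np.abs(contrib)))       # most implicated variable
```

Here the largest contribution correctly flags the biased variable; a nonlinear scheme replaces the linear statistic while keeping the same diagnostic logic.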

Keywords: diagnosis, filtering, nonlinear statistical techniques, process monitoring

Procedia PDF Downloads 224
5766 Influence of Water Hardness on Column Adsorption of Paracetamol by Biomass of Babassu Coconut Shell

Authors: O. M. Couto Junior, I. Matos, I. M. Fonseca, P. A. Arroyo, E. A. Silva, M. A. S. D. Barros

Abstract:

This study examined the adsorption of paracetamol from aqueous solutions on fixed beds of activated carbon prepared from babassu coconut shell. The effects of several operating conditions on the shape of the breakthrough curves were investigated, and the proposed model was successfully validated against literature data and the experimental data obtained. As the initial paracetamol concentration increased from 20 to 50 mg.L-1, the break-point time, tb, decreased from 18.00 to 10.50 hours. The unused bed length, HUNB, at the break-through point was obtained in the range of 1.62 to 2.81 for initial paracetamol concentrations of 20 to 50 mg.L-1. The presence of Ca+2 and Mg+2, which are responsible for increasing water hardness, significantly affected the adsorption kinetics and lowered the removal efficiency of paracetamol on activated carbon. The axial dispersion coefficient, DL, was constant for the concentrated feed solution but took different values for deionized and hard water. The mass transfer coefficient, Ks, increased with feed concentration.

Keywords: paracetamol, adsorption, water hardness, activated carbon

Procedia PDF Downloads 303
5765 A Review of BIM Applications for Heritage and Historic Buildings: Challenges and Solutions

Authors: Reza Yadollahi, Arash Hejazi, Dante Savasta

Abstract:

Building Information Modeling (BIM) is growing rapidly in construction projects around the world. Given BIM's weaknesses in modeling existing heritage and historical buildings, it is critical to facilitate BIM application for such structures. One piece of information needed to build a model in BIM is material and its characteristics, and a material library is essential to speed up the entry of project information. To save time and prevent cost overruns, a BIM object material library should be provided. However, the lack of information and documents for historical buildings is typically a challenge in renovation and retrofitting projects. Because case documents for historic buildings are scarce, importing data is a time-consuming task, which can be improved by creating BIM libraries. Based on previous research, this paper reviews the complexities and challenges of BIM modeling for heritage, historic, and architectural buildings. By identifying the strengths and weaknesses of standard BIM systems, recommendations are provided to enhance the modeling platform.

Keywords: building information modeling, historic, heritage buildings, material library

Procedia PDF Downloads 100
5764 Evaluation and Analysis of Light Emitting Diode Distribution in an Indoor Visible Light Communication

Authors: Olawale J. Olaluyi, Ayodele S. Oluwole, O. Akinsanmi, Johnson O. Adeogo

Abstract:

Visible light communication (VLC) is considered a cutting-edge technology for data transmission and illumination, since it uses less energy than radio frequency (RF) technology and offers large bandwidth, extended lifespan, and high security. An irregular distribution of the room's small base stations (LED arrays) causes shadowed areas and lowers the minimum signal-to-noise ratio (SNR) and received power. To maximize the received power distribution and SNR at the center of the room in an indoor VLC system, this work proposes a novel placement model for eight LED arrays. We investigated the arrangement of the LED array distribution with regard to received power so as to fill the open space in the center of the room. According to the simulation results, the suggested LED array distribution saved 36.2% of the transmitted power while covering the entire room evenly, increasing both received power and SNR.
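
Received-power maps of the kind evaluated here are normally computed from the line-of-sight Lambertian channel model, in which the power from each LED falls off with distance and emission/incidence angles. The room size, LED layout, and optical parameters below are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

ROOM, H = 5.0, 2.15          # room width/length (m), LED-to-desk height (m)
m = 1                        # Lambertian order (60 degree semi-angle)
A, Pt = 1e-4, 20.0           # detector area (m^2), power per LED array (W)

# eight LED arrays: four near the corners, four on an inner cluster
corners = [(1.0, 1.0), (1.0, 4.0), (4.0, 1.0), (4.0, 4.0)]
inner = [(2.0, 2.5), (3.0, 2.5), (2.5, 2.0), (2.5, 3.0)]
leds = corners + inner

x = np.linspace(0.0, ROOM, 51)
X, Y = np.meshgrid(x, x)
Pr = np.zeros_like(X)
for lx, ly in leds:
    d2 = (X - lx) ** 2 + (Y - ly) ** 2 + H ** 2   # squared link distance
    d = np.sqrt(d2)
    cos = H / d              # emission and incidence angles coincide here
    # Lambertian LOS gain: Pt * (m+1) * A / (2*pi*d^2) * cos^m * cos
    Pr += Pt * (m + 1) * A / (2 * np.pi * d2) * cos ** m * cos

centre = Pr[25, 25]          # received power at the room centre (2.5, 2.5)
```

Summing the map over candidate layouts is what lets a placement like the one proposed be compared against a plain corner arrangement.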

Keywords: visible light communication (VLC), light emitting diodes (LED), optical power distribution, signal-to-noise ratio (SNR)

Procedia PDF Downloads 67
5763 Beyond Sexual Objectification: Moderation Analysis of Trauma and Overexcitability Dynamics in Women

Authors: Ritika Chaturvedi

Abstract:

Introduction: Sexual objectification, characterized by the reduction of an individual to a mere object of sexual desire, remains a pervasive societal issue with profound repercussions on individual well-being. Such experiences, often rooted in systemic and cultural norms, have long-lasting implications for mental and emotional health. This study aims to explore the intricate relationship between experiences of sexual objectification and insidious trauma, further investigating the potential moderating effects of overexcitability as proposed by Dabrowski's theory of positive disintegration. Methodology: The research involved a comprehensive cohort of 204 women, spanning ages from 18 to 65 years. Participants completed self-administered questionnaires designed to capture their experiences of sexual objectification. Additionally, the questionnaire assessed symptoms indicative of insidious trauma and explored overexcitability across five distinct domains: emotional, intellectual, psychomotor, sensual, and imaginational. Employing advanced statistical techniques, including multiple regression and moderation analysis, the study sought to decipher the intricate interplay among these variables. Findings: The study's results revealed a compelling positive correlation between experiences of sexual objectification and the onset of symptoms indicative of insidious trauma. This correlation underscores the profound and detrimental effects of sexual objectification on an individual's psychological well-being. Interestingly, the moderation analyses introduced a nuanced understanding, highlighting the differential roles of the various overexcitabilities. Specifically, emotional, intellectual, and sensual overexcitability were found to exacerbate trauma symptomatology. In contrast, psychomotor overexcitability emerged as a protective factor, demonstrating a mitigating influence on the relationship between sexual objectification and trauma.
Implications: The study's findings hold significant implications for a diverse array of stakeholders, encompassing mental health practitioners, educators, policymakers, and advocacy groups. The identified moderating effects of overexcitability emphasize the need for tailored interventions that consider individual differences in coping and resilience mechanisms. By recognizing the pivotal role of overexcitability in modulating the traumatic consequences of sexual objectification, this research advocates for the development of more nuanced and targeted support frameworks. Moreover, the study underscores the importance of continued research endeavors to unravel the intricate mechanisms and dynamics underpinning these relationships. Such endeavors are crucial for fostering the evolution of informed, evidence-based interventions and strategies aimed at mitigating the adverse effects of sexual objectification and promoting holistic well-being.
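
The moderation analysis described above amounts to a regression with an interaction term: trauma symptoms are regressed on objectification, an overexcitability score, and their product, and the sign of the interaction coefficient distinguishes buffering from exacerbating moderators. The sketch below uses simulated data with a built-in negative interaction (the variable names and effect sizes are illustrative, not the study's estimates):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 204                                        # matches the cohort size
objectif = rng.normal(size=n)                  # centred predictor
overexc = rng.normal(size=n)                   # centred moderator
# simulated outcome with a negative (buffering) interaction built in
trauma = (0.5 * objectif + 0.2 * overexc
          - 0.3 * objectif * overexc
          + rng.normal(scale=0.5, size=n))

# OLS: trauma ~ b0 + b1*obj + b2*oe + b3*(obj*oe)
Xd = np.column_stack([np.ones(n), objectif, overexc, objectif * overexc])
beta, *_ = np.linalg.lstsq(Xd, trauma, rcond=None)
b3 = beta[3]   # interaction coefficient: the moderation effect
```

A negative b3 mirrors the protective pattern reported for psychomotor overexcitability; a positive b3 would mirror the exacerbating pattern of the other domains.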

Keywords: sexual objectification, insidious trauma, emotional overexcitability, intellectual overexcitability, sensual overexcitability, psychomotor overexcitability, imaginational overexcitability

Procedia PDF Downloads 39
5762 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System

Authors: Karima Qayumi, Alex Norta

Abstract:

The rapid generation of high volumes and a broad variety of data from the application of new technologies poses challenges for generating business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods in order to develop their business. The recently decentralized data-management environment therefore relies on a distributed computing paradigm. While data are stored in highly distributed systems, implementing distributed data-mining techniques is a challenge; the aim of these techniques is to gather knowledge from every domain and from all the datasets stemming from distributed resources. As agent technologies offer significant contributions to managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.

Keywords: agent-oriented modeling (AOM), business intelligence model (BIM), distributed data mining (DDM), multi-agent system (MAS)

Procedia PDF Downloads 415
5761 Video-On-Demand QoE Evaluation across Different Age-Groups and Its Significance for Network Capacity

Authors: Mujtaba Roshan, John A. Schormans

Abstract:

Quality of Experience (QoE) drives churn in the broadband networks industry, and good QoE plays a large part in customer retention. QoE is known to be affected by the Quality of Service (QoS) factors of packet loss probability (PLP), delay, and delay jitter caused by the network. Earlier results have shown that the relationship between these QoS factors and QoE is non-linear and may vary from application to application. We use the network emulator Netem as the basis for experimentation and evaluate how QoE varies as we change the emulated QoS metrics. Focusing on Video-on-Demand, we discovered that reported QoE may differ widely for users of different age groups, and that the most demanding age group (the youngest) can require an order of magnitude lower PLP to achieve the same QoE as the most widely studied age group of users. We then used a bottleneck TCP model to evaluate the capacity cost of achieving an order-of-magnitude decrease in PLP, and found it to be (almost always) a 3-fold increase in link capacity.
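
One way to see where a roughly 3-fold figure can come from is the classic Mathis model of steady-state TCP throughput, B ≈ (MSS/RTT)·√(3/2)/√p, under which throughput at fixed MSS and RTT scales as 1/√p. The paper's own bottleneck model is not reproduced here; the MSS and RTT values below are illustrative:

```python
import math

def mathis_throughput(mss_bytes, rtt_s, plp):
    """Mathis et al. steady-state TCP throughput in bits/s."""
    return (mss_bytes * 8 / rtt_s) * math.sqrt(1.5) / math.sqrt(plp)

b_high = mathis_throughput(1460, 0.05, 1e-2)   # PLP = 1e-2
b_low = mathis_throughput(1460, 0.05, 1e-3)    # PLP cut by 10x
ratio = b_low / b_high                          # = sqrt(10), about 3.16
```

Sustaining the same offered load at a tenth of the loss rate thus demands roughly √10 ≈ 3.2 times the per-flow capacity, consistent with the 3-fold result above.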

Keywords: network capacity, packet loss probability, quality of experience, quality of service

Procedia PDF Downloads 262
5760 A Study on Good Governance: Its Elements, Models, and Goals

Authors: Ehsan Daryadel, Hamid Shakeri

Abstract:

Good governance is considered one of the necessary prerequisites for promoting sustainable development programs in countries. A theoretical model of good governance sets out the best methods for the administration and management of a country. The importance of maintaining a balance between the needs of present and future generations through sustainable development has changed how governments manage and provide services for citizens, and good governance is now addressed as the most efficient and effective way of administering countries. This method is based on democratic, equality-seeking sustainable development, which tries to involve all actors and to be accountable to all citizens’ needs. In fact, good governance means the participation of all actors in the administration and management of the country in order to deliver public services, meet the general needs of citizens, and establish balance and harmony between the needs of present and future generations. In the present study, efforts have been made to present the concepts, definitions, purposes, and indices of good governance using a descriptive-analytical method.

Keywords: accountability, efficiency and effectiveness, good governance, rule of law, transparency

Procedia PDF Downloads 287
5759 Using the M-Learning to Support Learning of the Concept of the Derivative

Authors: Elena F. Ruiz, Marina Vicario, Chadwick Carreto, Rubén Peredo

Abstract:

One of the main obstacles in Mexico’s engineering programs is math comprehension, especially of the concept of the derivative. We therefore present a case study that relates mobile computing and classroom learning at the “Escuela Superior de Cómputo”, based on the educational model of the Instituto Politécnico Nacional (competence-based work and problem solving), in which we propose apps and activities to teach the concept of the derivative. M-learning is emphasized as one of its lines, as the objective is the use of mobile devices running an app that exploits components such as sensors, screen, camera, and processing power in classroom work. In this paper, we employed Augmented Reality (ARRoC), based on the good results this technology has had in the field of learning. The proposal was developed using a qualitative research methodology supported by quantitative research. The methodological instruments used were observation, questionnaires, interviews, and evaluations. We obtained positive results: a 40% increase using M-learning, compared with a 20% increase using traditional means.

Keywords: augmented reality, classroom learning, educational research, mobile computing

Procedia PDF Downloads 351
5758 Effect of Two Cooking Methods on Kinetics of Polyphenol Content, Flavonoid Content and Color of a Tunisian Meal: Molokheiya (Corchorus olitorius)

Authors: S. Njoumi, L. Ben Haj Said, M. J. Amiot, S. Bellagha

Abstract:

The main objective of this research was to establish the kinetics of variation of total polyphenol content (TPC) and total flavonoid content (TFC) in Tunisian Corchorus olitorius powder and in a traditional home-cooked meal (Molokheiya) when using stewing and stir-frying as cooking methods, and also to compare the effect of these two common cooking practices on water content, TPC, TFC, and color. The L*, a* and b* coordinate values of the Molokheiya varied from 24.955±0.039 to 21.301±0.036, from -1.556±0.048 to 0.23±0.026 and from 5.675±0.052 to 6.313±0.103 when using stewing, and from 21.328±0.025 to 20.56±0.021, from -1.093±0.011 to 0.121±0.007 and from 5.708±0.020 to 6.263±0.007 when using stir-frying, respectively. TPC and TFC increased during cooking. The TPC of Molokheiya varied from 29.852±0.866 mg GAE/100 g to 220.416±0.519 mg GAE/100 g after 150 min of stewing, and from 25.257±0.259 mg GAE/100 g to 208.897±0.173 mg GAE/100 g over 150 min of stir-frying. The TFC of Molokheiya varied from 48.229±1.47 mg QE/100 g to 843.802±1.841 mg QE/100 g when using stewing, and from 37.031±0.368 mg QE/100 g to 775.312±0.736 mg QE/100 g when using stir-frying. Kinetics followed similar curves in all cases but resulted in different final TPC and TFC values. The shape of the kinetic curves suggests zero-order kinetics. The mathematical relations and the numerical approach used to model the kinetics of polyphenol and flavonoid contents in Molokheiya are described.
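
Zero-order kinetics, as suggested above, means the content grows linearly in time, C(t) = C0 + k·t, so the rate constant k is just the slope of a straight-line fit. The sketch below fits illustrative points spanning the reported stewing TPC range (29.85 to 220.42 mg GAE/100 g over 150 min); the intermediate values are interpolated assumptions, not the paper's measurements:

```python
import numpy as np

# illustrative time points (min) and TPC values (mg GAE/100 g)
t = np.array([0.0, 30.0, 60.0, 90.0, 120.0, 150.0])
tpc = np.array([29.9, 68.0, 106.0, 144.0, 182.0, 220.4])

# zero-order model C(t) = c0 + k*t: k is the rate constant (slope)
k, c0 = np.polyfit(t, tpc, 1)
```

A near-constant slope across the cooking time is what distinguishes the zero-order curve shape from first-order (exponential-approach) kinetics.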

Keywords: Corchorus olitorius, Molokheiya, phenolic compounds, kinetics

Procedia PDF Downloads 337
5757 Development of Tensile Stress-Strain Relationship for High-Strength Steel Fiber Reinforced Concrete

Authors: H. A. Alguhi, W. A. Elsaigh

Abstract:

This paper provides a tensile stress-strain (σ-ε) relationship for High-Strength Steel Fiber Reinforced Concrete (HSFRC). The load-deflection (P-δ) behavior of HSFRC beams tested under four-point flexural loading was used with inverse analysis to calculate the tensile σ-ε relationship for the tested concrete grades (70 and 90 MPa) containing 60 kg/m3 (0.76%) of hook-end steel fibers. A first estimate of the tensile σ-ε relationship was obtained using RILEM TC 162-TDF and other methods available in the literature, frequently used for determining the tensile σ-ε relationship of Normal-Strength Concrete (NSC). The Non-Linear Finite Element Analysis (NLFEA) package ABAQUS® was used to model the beams’ P-δ behavior. The results show that an element-size-dependent tensile σ-ε relationship for HSFRC can be successfully generated and adopted for further analyses involving HSFRC structures.

Keywords: tensile stress-strain, flexural response, high strength concrete, steel fibers, non-linear finite element analysis

Procedia PDF Downloads 348
5756 Modelling of Powered Roof Supports Work

Authors: Marcin Michalak

Abstract:

Due to increasing efforts to protect the natural environment, a change in the structure of energy resources can be observed: an increasing fraction of renewable energy sources. In many countries traditional underground coal mining is losing its significance, but there are still countries, like Poland or Germany, in which coal-based technologies have the greatest share of total energy production. This makes it necessary to limit the costs and negative effects of underground coal mining. The longwall complex is an essential part of underground coal mining, and the safety and effectiveness of its work strongly depend on the diagnostic state of the powered roof supports. Building a useful and reliable diagnostic system requires a lot of data, and since acquiring data for every possible operating condition is impractical, it is important to be able to generate the demanded artificial working characteristics. In this paper a new approach to modelling the leg pressure in a single powered roof support unit is presented. The model is the result of an analysis of typical working cycles.

Keywords: machine modelling, underground mining, coal mining, structure

Procedia PDF Downloads 348
5755 Review of the Software Used for 3D Volumetric Reconstruction of the Liver

Authors: P. Strakos, M. Jaros, T. Karasek, T. Kozubek, P. Vavra, T. Jonszta

Abstract:

In medical imaging, segmentation of different areas of the human body, such as bones, organs, and tissues, is an important issue. Image segmentation isolates the object of interest for further processing, which can lead, for example, to 3D model reconstruction of whole organs. The difficulty of this procedure varies from trivial for bones to quite difficult for organs like the liver. The liver is considered one of the most difficult human organs to segment, mainly because of its complexity, shape variability, and proximity to other organs and tissues. Due to these facts, substantial user effort usually has to be applied to obtain satisfactory segmentation results, and the process deteriorates from automatic or semi-automatic to a fairly manual one. In this paper, an overview of selected available software applications that can handle semi-automatic image segmentation with subsequent 3D volume reconstruction of the human liver is presented. The applications are evaluated based on their segmentation results for several consecutive DICOM images covering the abdominal area of the human body.

Keywords: image segmentation, semi-automatic, software, 3D volumetric reconstruction

Procedia PDF Downloads 277
5754 Analysis of Financial Time Series by Using Ornstein-Uhlenbeck Type Models

Authors: Md Al Masum Bhuiyan, Maria C. Mariani, Osei K. Tweneboah

Abstract:

In the present work, we develop a technique for estimating the volatility of financial time series by using stochastic differential equations. Taking the daily closing prices of developed and emerging stock markets as the basis, we argue that incorporating stochastic volatility into time-varying parameter estimation significantly improves forecasting performance via Maximum Likelihood Estimation. Using the technique, we observe the long-memory behavior of the data sets and one-step-ahead predicted log-volatility with ±2 standard errors, despite the observed noise varying from a Normal mixture distribution, because the financial data studied are not fully Gaussian. The Ornstein-Uhlenbeck process followed in this work also simulates the financial time series well, and its good convergence properties make our estimation algorithm suitable for large data sets.
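
The core mechanics of an Ornstein-Uhlenbeck fit can be sketched in a few lines: simulate dX = θ(μ − X)dt + σdW with its exact discretization, then recover θ and μ from the implied AR(1) regression, which for this Gaussian model coincides with the maximum likelihood estimate. The parameter values below are illustrative, not taken from the paper's market data:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, mu, sigma = 2.0, 0.1, 0.3          # true OU parameters
dt, n = 1 / 252, 50_000                   # daily step, sample size

# exact discretization: X[t+dt] = a*X[t] + mu*(1-a) + Gaussian noise
a = np.exp(-theta * dt)
sd = sigma * np.sqrt((1 - a ** 2) / (2 * theta))
x = np.empty(n)
x[0] = mu
for i in range(n - 1):
    x[i + 1] = a * x[i] + mu * (1 - a) + sd * rng.normal()

# AR(1) fit of x[t+1] on x[t] recovers the parameters
slope, intercept = np.polyfit(x[:-1], x[1:], 1)
theta_hat = -np.log(slope) / dt           # mean-reversion speed
mu_hat = intercept / (1 - slope)          # long-run mean
```

In the volatility setting, x would be the log-volatility series rather than the price itself, but the estimation step is the same.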

Keywords: financial time series, maximum likelihood estimation, Ornstein-Uhlenbeck type models, stochastic volatility model

Procedia PDF Downloads 226
5753 Sparse-View CT Reconstruction Based on Nonconvex L1 − L2 Regularizations

Authors: Ali Pour Yazdanpanah, Farideh Foroozandeh Shahraki, Emma Regentova

Abstract:

Reconstruction from sparse-view projections is one of the important problems in computed tomography (CT), limited by the availability or feasibility of obtaining a large number of projections. Traditionally, convex regularizers have been exploited to improve reconstruction quality in sparse-view CT, and the convex constraint in those problems leads to an easy optimization process. However, convex regularizers often result in a biased approximation and inaccurate reconstruction in CT problems. Here, we present a nonconvex, Lipschitz-continuous, and non-smooth regularization model. The CT reconstruction is formulated as a nonconvex constrained L1 − L2 minimization problem and solved through a difference-of-convex algorithm and the alternating direction method of multipliers, which generates a better result than L0 or L1 regularizers in CT reconstruction. We compare our method with previously reported high-performance methods that use convex regularizers such as TV, wavelet, curvelet, and curvelet+TV (CTV) on test phantom images. The results show that there are benefits in using the nonconvex regularizer in sparse-view CT reconstruction.

Keywords: computed tomography, non-convex, sparse-view reconstruction, L1-L2 minimization, difference of convex functions

Procedia PDF Downloads 300
5752 Effect of Kenaf Fibres on Starch-Grafted-Polypropylene Biopolymer Properties

Authors: Amel Hamma, Allesandro Pegoretti

Abstract:

Kenaf fibres with two aspect ratios were melt-compounded with two types of biopolymer, namely starch-grafted polypropylenes, and the blends were then compression-molded into plates 1 mm thick. Results showed that processing induced a variation in fibre length, which was quantified by optical microscopy. The Young's modulus, stress at break, and impact resistance of the starch-grafted polypropylenes were remarkably improved by the kenaf fibres for both matrices, with the best values obtained when G906PJ was used as the matrix. These results attest to good interfacial bonding between matrix and fibres, even in the absence of any interfacial modification. The Vicat softening point and storage moduli were also improved due to the reinforcing effect of the fibres. Moreover, short-term tensile creep tests proved that kenaf fibres remarkably improve the creep stability of the composites. The creep behavior of the investigated materials was successfully modeled by the four-parameter Burgers model.
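
The four-parameter Burgers model mentioned above gives the creep strain under constant stress σ0 as the sum of an instantaneous elastic term, a viscous-flow term, and a delayed-elastic term: ε(t) = σ0/E1 + σ0·t/η1 + (σ0/E2)(1 − exp(−E2·t/η2)). The parameter values below are illustrative placeholders, not the paper's fitted constants:

```python
import numpy as np

def burgers_strain(t, s0, E1, eta1, E2, eta2):
    """Creep strain of the four-parameter Burgers model under stress s0."""
    return (s0 / E1                                   # instantaneous elastic
            + s0 * t / eta1                           # viscous flow
            + (s0 / E2) * (1 - np.exp(-E2 * t / eta2)))  # delayed elastic

t = np.linspace(0.0, 3600.0, 200)   # 1 h creep test, seconds
eps = burgers_strain(t, s0=10.0, E1=2000.0, eta1=5e6, E2=800.0, eta2=4e5)
```

Fitting the four constants to a measured creep curve (e.g. by least squares) is how the model captures both the initial jump and the long-time linear drift of the composite.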

Keywords: creep behaviour, kenaf fibres, mechanical properties, starch-grafted-polypropylene

Procedia PDF Downloads 219
5751 Application of Discrete-Event Simulation in Health Technology Assessment: A Cost-Effectiveness Analysis of Alzheimer’s Disease Treatment Using Real-World Evidence in Thailand

Authors: Khachen Kongpakwattana, Nathorn Chaiyakunapruk

Abstract:

Background: Decision-analytic models for Alzheimer’s disease (AD) have been advanced to discrete-event simulation (DES), in which individual-level modelling of disease progression across continuous severity spectra and incorporation of key parameters such as treatment persistence into the model become feasible. This study aimed to apply DES to perform a cost-effectiveness analysis of treatment for AD in Thailand. Methods: A dataset of Thai patients with AD, representing unique demographic and clinical characteristics, was bootstrapped to generate a baseline cohort of patients. Each patient was cloned and assigned to donepezil, galantamine, rivastigmine, memantine or no treatment. Throughout the simulation period, the model randomly assigned each patient to discrete events including hospital visits, treatment discontinuation and death. Correlated changes in cognitive and behavioral status over time were developed using patient-level data. Treatment effects were obtained from the most recent network meta-analysis. Treatment persistence, mortality and predictive equations for functional status, costs (Thai baht (THB) in 2017) and quality-adjusted life years (QALYs) were derived from country-specific real-world data. The time horizon was 10 years, with a discount rate of 3% per annum. Cost-effectiveness was evaluated against the willingness-to-pay (WTP) threshold of 160,000 THB/QALY gained (4,994 US$/QALY gained) in Thailand. Results: Under a societal perspective, only the prescription of donepezil to AD patients of all disease-severity levels was found to be cost-effective. Compared to untreated patients, although patients receiving donepezil incurred discounted additional costs of 2,161 THB, they experienced a discounted gain of 0.021 QALYs, resulting in an incremental cost-effectiveness ratio (ICER) of 138,524 THB/QALY (4,062 US$/QALY).
Moreover, providing early treatment with donepezil to mild AD patients further reduced the ICER to 61,652 THB/QALY (1,808 US$/QALY). However, the cost-effectiveness of donepezil appeared to wane when delayed treatment was given to a subgroup of moderate and severe AD patients [ICER: 284,388 THB/QALY (8,340 US$/QALY)]. Introducing a treatment stopping rule for the mild AD cohort when the Mini-Mental State Exam (MMSE) score falls below 10 did not deteriorate the cost-effectiveness of donepezil at the current treatment persistence level. On the other hand, none of the AD medications was cost-effective under a healthcare perspective. Conclusions: DES greatly enhances the real-world representativeness of decision-analytic models for AD. Under a societal perspective, treatment with donepezil improves patients' quality of life and is considered cost-effective when used to treat AD patients of all disease-severity levels in Thailand. The optimal treatment benefits are observed when donepezil is prescribed early in the course of AD. Given healthcare budget constraints in Thailand, coverage of donepezil is most likely to be feasible if implementation starts with mild AD patients, together with the stopping rule introduced here.
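
The discrete-event engine behind such an analysis can be sketched with a priority queue: each simulated patient carries a clock, and the next event (discontinuation, death, end of horizon) is always the earliest-scheduled one, with discounted QALYs accrued between events. The event rates, utility values, and lump-sum discounting at each interval's start below are illustrative assumptions, not the study's calibrated model:

```python
import heapq
import random

random.seed(7)
HORIZON, DISCOUNT = 10.0, 0.03     # years, annual discount rate

def simulate_patient():
    """One patient's discounted QALYs over the horizon (schematic)."""
    t, qaly, on_treatment = 0.0, 0.0, True
    events = []
    heapq.heappush(events, (random.expovariate(1 / 2.5), "discontinue"))
    heapq.heappush(events, (random.expovariate(1 / 8.0), "death"))
    while events:
        t_ev, kind = heapq.heappop(events)
        t_ev = min(t_ev, HORIZON)
        utility = 0.6 if on_treatment else 0.5
        # accrue QALYs over (t, t_ev), discounted at the interval start
        qaly += utility * (t_ev - t) / (1 + DISCOUNT) ** t
        t = t_ev
        if t >= HORIZON or kind == "death":
            return qaly
        if kind == "discontinue":
            on_treatment = False
    return qaly

mean_qaly = sum(simulate_patient() for _ in range(2000)) / 2000
```

Running the same cohort through each treatment arm and differencing costs and QALYs is what yields the ICERs reported above.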

Keywords: Alzheimer's disease, cost-effectiveness analysis, discrete event simulation, health technology assessment

Procedia PDF Downloads 112
5750 Fragility Analysis of Weir Structure Subjected to Flooding Water Damage

Authors: Oh Hyeon Jeon, WooYoung Jung

Abstract:

In this study, seepage analysis driven by the water-level difference between the upstream and downstream sides of a weir structure was performed for safety evaluation of the structure against flooding. The Monte Carlo simulation method was employed, considering the probability distribution of the adjacent ground parameter, i.e., the permeability coefficient of the weir foundation. Moreover, the weir structure was modeled using a commercially available finite element program (ABAQUS). Based on this model, the characteristics of water seepage during flooding were determined at each water level, taking into account the uncertainty of the corresponding permeability coefficient. A fragility function could then be constructed from the responses of the numerical analysis; this fragility function can be used to determine the weaknesses of weir structures subjected to flooding disasters. It can also serve as reference data for comprehensively predicting the probability of failure and the degree of damage of a weir structure.
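
The fragility construction reads, in outline, as: sample the permeability coefficient from its probability distribution, evaluate whether each sample fails at a given water level, and take the failure fraction as the fragility at that level. The closed-form capacity relation below is a stand-in for the ABAQUS seepage runs, and all numbers are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000
# lognormal permeability coefficient (m/s), median 1e-5
k = rng.lognormal(mean=np.log(1e-5), sigma=0.5, size=n)

# placeholder capacity model: sustainable head difference (m) shrinks
# as the ground becomes more permeable (illustrative, not the paper's)
capacity = 4.0 * (1e-5 / k) ** 0.3

levels = np.linspace(1.0, 8.0, 15)            # upstream-downstream head (m)
fragility = np.array([(capacity < h).mean() for h in levels])
```

The resulting curve rises monotonically with water level, which is the shape a fragility function constructed from the finite element responses would take.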

Keywords: weir structure, seepage, flood disaster fragility, probabilistic risk assessment, Monte-Carlo simulation, permeability coefficient

Procedia PDF Downloads 332
5749 Agri-Tourism as a Sustainable Adaptation Option for Climate Change Impacts on Small Scale Agricultural Sector

Authors: Rohana Pandukabhya Mahaliyanaarachchi, Maheshwari Sangeetha Elapatha, Mohamed Esham, Banagala Chathurika Maduwanthi

Abstract:

The global climate change has become one of the imperative issues for the smallholder dominated agricultural sector and nature based tourism sector in Sri Lanka. Thus addressing this issue is notably important. The main objective of this study was to investigate the potential of agri-tourism as a sustainable adaptation option to mitigate some of the negative impacts of climate change in small scale agricultural sector in Sri Lanka. The study was carried out in two different climatic zones in Sri Lanka namely Low Country Dry Zone and Up Country Wet Zone. A case study strategy followed by structured and unstructured interviewers through cross-sectional surveys were adapted to collect data. The study revealed that there had been a significant change in the climate in regard to the rainfall patterns in both climatic zones resulting unexpected rains during months and longer drought periods. This results the damages of agricultural production, low yields and subsequently low income. However, to mitigate these adverse effects, farmers have mainly focused on using strategies related to the crops and farming patterns rather than diversifying their business by adopting other entrepreneurial activities like agri-tourism. One of the major precursor for this was due to lesser awareness on the concept of agri-tourism within the farming community. The study revealed that the respondents of both climatic zones do have willingness and potential to adopt agri-tourism. One key important factor identified was that farming or agriculture was the main livelihood of the respondents, which is one of the vital precursor needed to start up an agri-tourism enterprise. Most of the farmers in the Up Country Wet Zone had an inclination to start a farm guest house or a farm home stay whereas the farmers in the Low Country Dry Zone wish to operate farm guest house, farm home stay or farm restaurant. 
They were also interested in opening a roadside farm product stall to facilitate direct sales from the farm. The majority of farmers in both climatic zones showed an interest in initiating an agri-tourism business as a complementary enterprise, giving an equal share to both farming and agri-tourism. This indicates that the farmers regard agri-tourism as a vital concept and give it the same importance as farming, and that most of them understand it as an alternative income source. This study therefore emphasizes agri-tourism as an alternative income source capable of mitigating the adverse effects of climate change on the small-scale agriculture sector.

Keywords: adaptation, agri-tourism, climate change, small scale agriculture

Procedia PDF Downloads 142
5748 The Impact of a Staff Well-Being Service for a Multi-Site Research Study

Authors: Ruth Elvish, Alex Turner, Jen Wells

Abstract:

Over recent years there has been an increasing interest in the topic of well-being at work, and staff support is an area of continued growth. The present qualitative study explored the impact of a staff well-being service that was specifically attached to a five-year multi-site research programme (the Neighbourhoods and Dementia Study, funded by the ESRC/NIHR). The well-being service was led by a clinical psychologist, who offered 1:1 sessions for staff and co-researchers with dementia. To our knowledge, this service was the first of its kind. Methodology: Interviews were undertaken with staff who had used the service and who opted to take part in the study (n=7). Thematic analysis was used as the method of analysis. Findings: Themes included: triggers, mechanisms of change, impact/outcomes, and unique aspects of a dedicated staff well-being service. Conclusions: The study highlights stressors that are pertinent amongst staff within academic settings, and shows the ways in which a dedicated staff well-being service can impact on both professional and personal lives. Positive change was seen in work performance, self-esteem, relationships, and coping. This exploratory study suggests that this well-being service model should be further trialled and evaluated.

Keywords: academic, service, staff, support, well-being

Procedia PDF Downloads 184
5747 Application of Grasshopper Optimization Algorithm for Design and Development of Net Zero Energy Residential Building in Ahmedabad, India

Authors: Debasis Sarkar

Abstract:

This paper aims to apply the Grasshopper Optimization Algorithm (GOA) to designing and developing a net-zero-energy residential building for a mega-city like Ahmedabad, India. The methodology implemented includes tools such as Revit for model creation and MATLAB for simulation, enabling the optimization of the building design. GOA has been applied to reducing cooling loads and overall energy consumption through optimized passive design features. To attain the net-zero-energy target, solar panels were installed on the roof of the building. It was observed that 8490 kWh of the building's energy consumption was supplied by the installed solar panels, so only 840 kWh had to be met from non-renewable energy sources. The energy consumption was further reduced through simulation and optimization methods such as GOA, which brought it down to about 37.56 kWh per month from April to July, when energy demand was at its peak. This endeavor aimed to achieve near-zero energy consumption, showcasing the potential of renewable energy integration for building sustainability.
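
The abstract does not detail the optimizer itself; for readers unfamiliar with GOA, the following minimal sketch implements the standard Grasshopper Optimization Algorithm (Saremi et al.) on a toy objective. It illustrates the technique only, not the paper's Revit/MATLAB workflow, and all parameter values are assumptions.

```python
import numpy as np

def s(r, f=0.5, l=1.5):
    """Social force between two grasshoppers: attraction at medium
    range, repulsion at short range (the standard s-function)."""
    return f * np.exp(-r / l) - np.exp(-r)

def goa(objective, lb, ub, n_agents=20, n_iter=100, seed=0):
    """Minimal Grasshopper Optimization Algorithm minimising `objective`."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = lb.size
    X = rng.uniform(lb, ub, size=(n_agents, dim))
    fitness = np.array([objective(x) for x in X])
    best, best_f = X[fitness.argmin()].copy(), float(fitness.min())
    c_max, c_min = 1.0, 1e-5
    for t in range(n_iter):
        c = c_max - t * (c_max - c_min) / n_iter  # shrinking comfort zone
        X_new = np.empty_like(X)
        for i in range(n_agents):
            social = np.zeros(dim)
            for j in range(n_agents):
                if i == j:
                    continue
                d = float(np.linalg.norm(X[j] - X[i]))
                r = 2.0 + d % 2.0             # remap distance into [2, 4)
                unit = (X[j] - X[i]) / (d + 1e-12)
                social += c * (ub - lb) / 2.0 * s(r) * unit
            # Move each agent relative to the best solution found so far
            X_new[i] = np.clip(c * social + best, lb, ub)
        X = X_new
        fitness = np.array([objective(x) for x in X])
        if fitness.min() < best_f:
            best_f = float(fitness.min())
            best = X[fitness.argmin()].copy()
    return best, best_f

# Toy usage: minimise the sphere function in 3 dimensions
best, best_f = goa(lambda x: float(np.sum(x ** 2)), [-5.0] * 3, [5.0] * 3)
```

In a building-design setting, the decision vector would instead encode passive design parameters (e.g. glazing ratio, orientation) and the objective would be a simulated annual energy figure.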

Keywords: grasshopper optimization algorithm, net zero energy, residential building, sustainable design

Procedia PDF Downloads 4
5745 From Talk to Action: Tackling Africa’s Pollution and Climate Change Problem

Authors: Ngabirano Levis

Abstract:

One of Africa’s major environmental challenges remains air pollution. In 2017, UNICEF estimated that over 400,000 children in Africa died as a result of indoor pollution, while 350 million children remain exposed to the risks of indoor pollution due to the use of biomass and the burning of wood for cooking. Over time, the major causes of mortality across Africa have indeed been shifting from unsafe water, poor sanitation and malnutrition to ambient and household indoor pollution, with greenhouse gas (GHG) emissions remaining a key factor. In addition, OECD studies estimated the economic cost of premature deaths due to Ambient Particulate Matter Pollution (APMP) and Household Air Pollution across Africa in 2013 at about 215 billion and 232 billion US dollars, respectively. This is not only a huge cost for a continent where over 41% of the Sub-Saharan population lives on less than 1.9 US dollars a day, but it also leaves people extremely vulnerable to the negative effects of climate change and environmental degradation. Such impacts have led to extended droughts, flooding, health complications and reduced crop yields, hence food insecurity. Climate change therefore poses a threat to global targets such as poverty reduction, health and famine relief. Despite mitigation efforts, emissions of contributors such as carbon dioxide are on a generally upward trajectory across Africa. In Egypt, for instance, emission levels in 2010 had increased by over 141% from the 1990 baseline. Efforts such as climate change adaptation and mitigation financing have also hit obstacles on the continent: the international community and developed nations stress that Africa still faces the challenge of limited human, institutional and financial systems capable of attracting climate funding from developed economies. 
Using a qualitative multi-case-study method, supplemented by interviews with key actors and a comprehensive textual analysis of the relevant literature, this paper dissects the key emission and air pollutant sources and their impact on the well-being of the African people, and puts forward suggestions as well as remedial mechanisms for these challenges. The findings reveal that whereas climate change mitigation plans appear comprehensive and good on paper for many African countries such as Uganda, lingering political interference, limited research-guided planning, lack of population engagement, irrational resource allocation and limited system and personnel capacity have largely impeded the realization of the set targets. Recommendations are put forward to address the climate change impacts that threaten the food security, health and livelihoods of the people of the continent.

Keywords: Africa, air pollution, climate change, mitigation, emissions, effective planning, institutional strengthening

Procedia PDF Downloads 69
5745 Evaluation of Batch Splitting in the Context of Load Scattering

Authors: S. Wesebaum, S. Willeke

Abstract:

Production companies face an increasingly turbulent business environment, which demands very high flexibility in production volumes and delivery dates. If decoupling by storage stages is not possible (e.g. at a contract manufacturer) or is undesirable from a logistical point of view, load scattering affects the production processes. ‘Load’ characterizes the timing and quantity incidence of production orders (e.g. in work content hours) at workstations, which results in specific capacity requirements. Insufficient coordination between load (demanded capacity) and capacity supply results in heavy load scattering, which can be described by deviations and uncertainties in the input behavior of a capacity unit. In order to respond to fluctuating loads, companies try to implement consistent and realizable input behavior using the available capacity supply. For example, a uniformly high level of equipment utilization keeps production costs down. In contrast, strong load scattering at workstations leads to performance loss or disproportionately fluctuating WIP, affecting the logistics objectives negatively. Options for reducing load scattering include shifting the start and end dates of orders, batch splitting, and outsourcing operations or shifting them to other workstations. This adjusts load to capacity supply and thus reduces load scattering. If load cannot be fully adapted to capacity, flexible capacity may have to be used to ensure that the performance of a workstation does not decrease for a given load. Where the use of flexible capacities normally raises costs, an adjustment of load to capacity supply reduces load scattering and, in consequence, costs. The literature mostly offers qualitative statements for describing load scattering; quantitative evaluation methods that describe load mathematically are rare. 
In this article, the authors discuss existing approaches for calculating load scattering and their various disadvantages, such as the lack of an opportunity for normalization. These approaches are the basis for the development of our mathematical quantification approach for describing load scattering, which compensates for the disadvantages of the current approaches. After presenting our quantification approach, the method of batch splitting is described. Batch splitting allows the adaptation of load to capacity in order to reduce load scattering. The method is then explicitly analyzed in the context of the logistic curve theory by Nyhuis, using the stretch factor α1, in order to evaluate its impact on load scattering and on the logistic curves. The conclusion of this article shows how the methods and approaches presented can help companies in a turbulent environment to quantify the occurring work load scattering accurately and to apply an efficient method for adjusting work load to capacity supply. In this way, achievement of the logistical objectives is improved without causing additional costs.
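
The paper's own quantification approach is not reproduced in the abstract. As a hedged illustration only, the sketch below uses the coefficient of variation of per-period load as a stand-in scattering measure, and a deliberately crude batch-splitting rule, to show how splitting levels the load at a workstation:

```python
import statistics

def load_cv(period_loads):
    """Coefficient of variation of per-period load: one simple, normalised
    stand-in for a load-scattering measure (0 = perfectly levelled load)."""
    mean = statistics.mean(period_loads)
    return statistics.pstdev(period_loads) / mean if mean else 0.0

def split_large_orders(period_loads, max_hours):
    """Crude batch splitting: the excess of any period's load beyond
    max_hours is shifted into the following period."""
    loads = list(period_loads)
    for t in range(len(loads) - 1):
        if loads[t] > max_hours:
            loads[t + 1] += loads[t] - max_hours
            loads[t] = max_hours
    return loads

before = [10, 0, 10, 0]                          # two large orders, idle periods between
after = split_large_orders(before, max_hours=5)  # -> [5, 5, 5, 5]
```

Here splitting drives the stand-in scattering measure from 1.0 down to 0.0; the article's own measure and splitting policy will differ, but the direction of the effect is the point.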

Keywords: batch splitting, production logistics, production planning and control, quantification, load scattering

Procedia PDF Downloads 386
5744 Preformed Au Colloidal Nanoparticles Immobilised on NiO as Highly Efficient Heterogeneous Catalysts for Reduction of 4-Nitrophenol to 4-Aminophenol

Authors: Khaled Alshammari

Abstract:

A facile approach to synthesizing highly active and stable Au/NiO catalysts for the hydrogenation of nitro-aromatics is reported. Preformed gold nanoparticles have been immobilized onto NiO using a colloidal method. In this article, the reduction of 4-nitrophenol with NaBH4 has been used as a model reaction to investigate the catalytic activity of synthesized Au/NiO catalysts. In addition, we report a systematic study of the reduction kinetics and the influence of specific reaction parameters such as (i) temperature, (ii) stirring rate, (iii) sodium borohydride concentration and (iv) substrate/metal molar ratio. The reaction has been performed at a substrate/metal molar ratio of 7.4, a ratio significantly higher than previously reported. The reusability of the catalyst has been examined, with little to no decrease in activity observed over 5 catalytic cycles. Systematic variation of Au loading reveals the successful synthesis of low-cost and efficient Au/NiO catalysts at very low Au content and using high substrate/metal molar ratios.
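
With NaBH4 in large excess, this model reaction is conventionally analyzed as pseudo-first-order in 4-nitrophenol. The sketch below (synthetic data, not the paper's measurements) shows how the apparent rate constant k_app is typically extracted from the absorbance decay:

```python
import math

def fit_kapp(times, absorbances):
    """Least-squares slope of ln(A0/At) versus t: the apparent
    pseudo-first-order rate constant k_app (excess NaBH4 assumed)."""
    a0 = absorbances[0]
    ys = [math.log(a0 / a) for a in absorbances]
    n = len(times)
    tbar, ybar = sum(times) / n, sum(ys) / n
    num = sum((t - tbar) * (y - ybar) for t, y in zip(times, ys))
    den = sum((t - tbar) ** 2 for t in times)
    return num / den

# Synthetic absorbance decay with k_app = 0.30 min^-1 (illustrative values)
times = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
absorbance = [math.exp(-0.30 * t) for t in times]
k_app = fit_kapp(times, absorbance)  # recovers ~0.30 min^-1
```

In practice the absorbance is read at 400 nm (the 4-nitrophenolate band), and the influence of temperature, stirring rate and NaBH4 concentration studied in the paper shows up as changes in the fitted k_app.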

Keywords: nanochemistry, catalyst, supported nanoparticles, characterization of materials, colloidal nanoparticles

Procedia PDF Downloads 41
5743 Analysis of the Impact of Refractivity on Ultra High Frequency Signal Strength over Gusau, North West, Nigeria

Authors: B. G. Ayantunji, B. Musa, H. Mai-Unguwa, L. A. Sunmonu, A. S. Adewumi, L. Sa'ad, A. Kado

Abstract:

For achieving a reliable and efficient communication system, both terrestrial and satellite, surface refractivity is critical in the planning and design of radio links. This study analyzed the impact of atmospheric parameters on Ultra High Frequency (UHF) signal strength over Gusau, North West, Nigeria. The analysis exploited meteorological data measured simultaneously with UHF signal strength for the month of June 2017, using a Davis Vantage Pro2 automatic weather station and UHF signal strength measuring devices, respectively. The instruments were situated on the premises of Federal University, Gusau (6° 78' N, 12° 13' E). The refractivity values were computed using the ITU-R model. The results show that refractivity attained a maximum value of 366.28 at 2200 hr and a minimum value of 350.66 at 2100 hr local time. The correlation between signal strength and refractivity is 0.350; with humidity it is 0.532; and with temperature it is negative, at -0.515.
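
The ITU-R surface refractivity model referred to above is N = 77.6/T · (P + 4810·e/T) (ITU-R P.453, with T in kelvin and pressures in hPa). A minimal sketch, assuming the humidity-to-vapour-pressure conversion also follows that recommendation's saturation formula:

```python
import math

def refractivity(p_hpa, t_celsius, rh_percent):
    """Radio refractivity N (N-units) per ITU-R P.453:
    N = 77.6/T * (P + 4810*e/T), with T in kelvin and P, e in hPa."""
    t_k = t_celsius + 273.15
    # Saturation vapour pressure over water (hPa), ITU-R P.453 constants
    es = 6.1121 * math.exp(17.502 * t_celsius / (t_celsius + 240.97))
    e = rh_percent / 100.0 * es          # actual water vapour pressure
    return 77.6 / t_k * (p_hpa + 4810.0 * e / t_k)

n = refractivity(1013.0, 25.0, 80.0)     # roughly 370 N-units: warm, humid air
```

The first term (the dry term) depends only on pressure and temperature; the second (wet) term is what makes refractivity track humidity so strongly, consistent with the correlations reported above.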

Keywords: refractivity, UHF (ultra high frequency) signal strength, free space, automatic weather station

Procedia PDF Downloads 183
5742 Computational Approaches for Ballistic Impact Response of Stainless Steel 304

Authors: A. Mostafa

Abstract:

This paper presents a numerical study on the determination of the ballistic limit velocity (V50) of stainless steel 304 (SS 304) used in manufacturing security screens. The simulated ballistic impact tests were conducted on clamped sheets of different thicknesses using the ABAQUS/Explicit nonlinear finite element (FE) package. The ballistic limit velocity was determined using three approaches, namely: numerical tests based on material properties, FE-calculated residual velocities, and FE-calculated residual energies. Johnson-Cook plasticity and failure models were utilized to simulate the dynamic behaviour of the SS 304 at various strain rates, while the well-known Lambert-Jonas equation was used for data regression in the residual velocity and energy models. Good agreement between the investigated numerical methods was achieved. Additionally, the dependence of the ballistic limit velocity on the sheet thickness was observed. The proposed approaches present viable and cost-effective assessment methods for the ballistic performance of SS 304, which will support the development of robust security screen systems.
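
The Lambert-Jonas equation used for the residual-velocity regression is Vr = a·(Vi^p − V50^p)^(1/p) for impact velocities Vi above the ballistic limit V50, and Vr = 0 otherwise. A minimal sketch (the constants a and p here are illustrative defaults, not fitted values from this study):

```python
def residual_velocity(v_impact, v50, a=1.0, p=2.0):
    """Lambert-Jonas relation: Vr = a * (Vi**p - V50**p)**(1/p) for
    Vi > V50, else 0 (projectile stopped). a, p are regression constants
    fitted to test or FE data."""
    if v_impact <= v50:
        return 0.0
    return a * (v_impact ** p - v50 ** p) ** (1.0 / p)

# With a=1 and p=2 the relation reduces to an energy-balance form:
vr = residual_velocity(500.0, v50=400.0)   # sqrt(500^2 - 400^2) = 300
```

In the paper's workflow the FE-calculated residual velocities (or energies) would be regressed against this form to extract V50 for each sheet thickness.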

Keywords: ballistic velocity, stainless steel, numerical approaches, security screen

Procedia PDF Downloads 142
5741 Propagation of DEM Varying Accuracy into Terrain-Based Analysis

Authors: Wassim Katerji, Mercedes Farjas, Carmen Morillo

Abstract:

Terrain-based analysis produces derived products from an input DEM, and these products are needed to perform various analyses. To use them efficiently in decision-making, their accuracies must be estimated systematically. This paper proposes a procedure to assess the accuracy of these derived products by calculating the accuracy of the slope dataset and its significance, taking the accuracy of the DEM as input. Based on the output of previously published research on modeling the relative accuracy of a DEM, specifically the ASTER and SRTM DEMs over Lebanon as the study area, the analyses showed that ASTER has low significance over the majority of the area, with only 2% of the modeled terrain having 50% or more significance. SRTM, on the other hand, showed better significance, with 37% of the modeled terrain having 50% or more significance. Statistical analysis deduced that the accuracy of the slope dataset, calculated on a cell-by-cell basis, is highly correlated with the accuracy of the input DEM. However, this correlation becomes lower between slope accuracy and slope significance, whereas it becomes much higher between the modeled slope and the slope significance.
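
The cell-by-cell propagation of DEM vertical error into slope accuracy can be illustrated, under stated assumptions (Horn's slope method, Gaussian vertical error, a 30 m cell), by a small Monte-Carlo sketch. This is an illustration of the general idea, not the authors' actual procedure:

```python
import numpy as np

def horn_slope_deg(z, cell=30.0):
    """Slope (degrees) at the centre of a 3x3 DEM window, Horn's method."""
    dzdx = ((z[0, 2] + 2 * z[1, 2] + z[2, 2])
            - (z[0, 0] + 2 * z[1, 0] + z[2, 0])) / (8 * cell)
    dzdy = ((z[2, 0] + 2 * z[2, 1] + z[2, 2])
            - (z[0, 0] + 2 * z[0, 1] + z[0, 2])) / (8 * cell)
    return float(np.degrees(np.arctan(np.hypot(dzdx, dzdy))))

def slope_sigma(z, dem_sigma, cell=30.0, n=2000, seed=0):
    """Monte-Carlo propagation of DEM vertical error (std dev dem_sigma,
    metres) into slope error (degrees) for one cell."""
    rng = np.random.default_rng(seed)
    samples = [horn_slope_deg(z + rng.normal(0.0, dem_sigma, z.shape), cell)
               for _ in range(n)]
    return float(np.std(samples))

# A uniform 1 m rise per 30 m cell eastward -> slope of about 1.91 degrees
z = np.array([[100.0, 101.0, 102.0],
              [100.0, 101.0, 102.0],
              [100.0, 101.0, 102.0]])
```

Repeating this over every cell yields a slope-accuracy raster from the DEM-accuracy raster, which is the kind of per-cell correlation the abstract reports.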

Keywords: terrain-based analysis, slope, accuracy assessment, Digital Elevation Model (DEM)

Procedia PDF Downloads 431
5740 User Intention Generation with Large Language Models Using Chain-of-Thought Prompting

Authors: Gangmin Li, Fan Yang

Abstract:

Personalized recommendation is crucial for any recommendation system, and one technique for personalizing recommendations is to identify the user's intention. Traditional user intention identification relies on the user’s selections when facing multiple items. This modeling depends primarily on historical behaviour data, resulting in challenges such as cold start, unintended choices, and failure to capture intention when items are new. Motivated by recent advancements in Large Language Models (LLMs) such as ChatGPT, we present an approach to user intention identification that embraces LLMs with Chain-of-Thought (CoT) prompting. We use the initial user profile as input to the LLM and design a collection of prompts to align the LLM's responses across various recommendation tasks, encompassing rating prediction, search and browse history, user clarification, etc. Our tests on real-world datasets demonstrate improvements in recommendation when the explicitly identified user intention is merged into the user model.
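
The actual prompt collection is not given in the abstract. The following hypothetical sketch shows how a chain-of-thought prompt might be assembled from a user profile, ratings, browse history and an optional clarification; all field names and step wording are invented for illustration, and no LLM API is called:

```python
def build_intention_prompt(profile, ratings, history, clarification=None):
    """Assemble a chain-of-thought prompt asking an LLM to reason step by
    step from behaviour signals to one explicit user intention.
    (Illustrative structure only, not the paper's actual prompts.)"""
    parts = [
        f"User profile: {profile}",
        f"Recent ratings: {ratings}",
        f"Search/browse history: {history}",
    ]
    if clarification:
        parts.append(f"User clarification: {clarification}")
    parts += [
        "Let's think step by step.",
        "Step 1: Summarise the user's stated profile.",
        "Step 2: Infer long-term preferences from the ratings.",
        "Step 3: Infer short-term interest from the search/browse history.",
        "Step 4: Reconcile any conflicts and state ONE current intention.",
    ]
    return "\n".join(parts)

prompt = build_intention_prompt(
    profile="hiker, budget-conscious",
    ratings={"trail shoes": 5, "city sneakers": 2},
    history=["waterproof jacket", "ultralight tent"],
)
```

The LLM's answer to such a prompt (the stated intention) is then what gets merged into the user model for the downstream recommendation tasks.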

Keywords: personalized recommendation, generative user modelling, user intention identification, large language models, chain-of-thought prompting

Procedia PDF Downloads 31