Search results for: operator approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13753

6163 Antecedents of Perceptions About Halal Foods Among Non-Muslims in United States of America

Authors: Saira Naeem, Rana Muhammad Ayyub

Abstract:

The main objective of this study is to empirically examine the antecedents of non-Muslim consumers' perceptions of Halal foods. A questionnaire survey of non-Muslims (n=222) in the USA was conducted through surveymonkey.com. Validated scales for knowledge about Halal foods, animal welfare concerns, acculturation, and perception of Halal foods were adopted after necessary adaptation. Structural equation modelling (SEM) was used to test the structural model. Knowledge about Halal foods and ongoing acculturation among non-Muslims were found to have a positive effect on perception of Halal food, whereas animal welfare concerns had a negative effect. The moderating effect of acculturation was found to be non-significant. It is recommended that Halal food marketers increase their efforts to educate customers about Halal foods, and that non-Muslim consumers be assured that their animal welfare concerns are adequately addressed throughout Halal food production and the supply chain. Online data collection is the only limitation of this study. The findings will guide Halal marketers in western countries in marketing Halal food products and services to non-Muslim customers.

Keywords: non-Muslims, consumer perceptions, animal welfare concerns, acculturation, knowledge about Halal

Procedia PDF Downloads 98
6162 Sustainable Approach for Strategic Planning of Construction of Buildings using Multi-Criteria Decision Making Tools

Authors: Kishor Bhagwat, Gayatri Vyas

Abstract:

The construction industry is marked by complex processes that depend on the nature and scope of the project. In recent years, developments in this sector have been remarkable and have had both positive and negative impacts on the environment and human beings. Sustainable construction can be viewed as one solution for mitigating the negative impacts. Since sustainable construction is a broad concept encompassing many parameters, the use of multi-criteria decision making [MCDM] tools sometimes becomes necessary. The main objective of this study is to determine the weights of sustainable building parameters with the help of MCDM tools. A questionnaire survey was conducted to examine respondents' views on the importance of the criteria weights; the respondents were architects, green building consultants, and civil engineers. This paper presents an overview of research related to Indian and international green building rating systems and MCDM. The results show that economy, environmental health and safety, site selection, and climatic conditions, among others, are important parameters in sustainable construction.
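As an illustration of how MCDM tools assign criteria weights, the row-geometric-mean approximation to AHP priority weights can be sketched as follows; the pairwise comparison values and the three criteria are hypothetical, not taken from the survey:

```python
# Geometric-mean approximation of AHP priority weights from a pairwise
# comparison matrix (comparison values are invented, for illustration only).
import math

def ahp_weights(matrix):
    """Approximate the AHP priority vector via row geometric means."""
    gmeans = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical 3x3 comparisons: economy vs. health & safety vs. site selection
pairwise = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])
```

The geometric-mean method is a common closed-form approximation to the principal eigenvector used in full AHP.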

Keywords: green building, sustainability, multi-criteria decision making method [MCDM], analytical hierarchy process [AHP], technique for order preference by similarity to an ideal solution [TOPSIS], entropy

Procedia PDF Downloads 81
6161 A Multi-Criteria Model for Scheduling of Stochastic Single Machine Problem with Outsourcing and Solving It through Application of Chance Constrained

Authors: Homa Ghave, Parmis Shahmaleki

Abstract:

This paper presents a new multi-criteria stochastic mathematical model for single machine scheduling with outsourcing allowed. Jobs are processed in batches; for each batch, all of the jobs or some quantity of them can be outsourced. Jobs with deterministic due dates arrive randomly and have stochastic processing times and lead times. Because of the stochastic nature of the processing times and lead times, chance constrained programming is used to model the problem. The problem is first formulated as a stochastic program and then converted into a deterministic mixed integer linear program. The objectives are to minimize the maximum tardiness and the outsourcing cost simultaneously. Several procedures have been developed to deal with the multi-criteria problem; in this paper, the concept of satisfaction functions is used to incorporate the manager's preferences. The proposed approach is tested on instances where the random variables are normally distributed.
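A minimal sketch of how a single chance constraint is converted into its deterministic equivalent under the normality assumption the paper uses; the mean, standard deviation, and confidence level below are invented for illustration:

```python
# Deterministic equivalent of a simple chance constraint, assuming a normally
# distributed completion time T ~ N(mean, std). Numbers are illustrative.
from statistics import NormalDist

def deterministic_rhs(mean, std, alpha):
    """P(T <= c) >= alpha  <=>  c >= mean + z_alpha * std."""
    z = NormalDist().inv_cdf(alpha)  # alpha-quantile of the standard normal
    return mean + z * std

# Require a batch to finish by the due date with 95% confidence.
c = deterministic_rhs(mean=40.0, std=5.0, alpha=0.95)
print(round(c, 2))
```

This is the standard transformation that turns the stochastic program into a solvable deterministic one.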

Keywords: single machine scheduling, multi-criteria mathematical model, outsourcing strategy, uncertain lead times and processing times, chance constrained programming, satisfaction function

Procedia PDF Downloads 250
6160 Exploring Reading Attitudes among Iranian English Language Teachers

Authors: Narges Nemati, Mohammadreza Fallahpour, Hossein Bozorgian

Abstract:

Reading is a receptive skill that plays an important role in improving other skills such as writing and speaking. Through reading, language learners can also acquire a large vocabulary and become more acquainted with written expression. Negative attitudes toward reading can lead to poor reading comprehension, which in turn can result in poor performance in English. Considering that reading instruction has been treated as a low-priority skill in EFL teacher education, this study investigated EFL teachers' attitudes toward reading instruction. A mixed-method approach was used: 100 Iranian EFL teachers working at English language institutes in Iran filled out a validated questionnaire on teachers' attitudes toward reading. Subsequently, 10 participants were randomly selected for observations and interview sessions to evaluate the differences between their stated attitudes and their actual practices. Analysis of the questionnaires, observations, and interviews revealed that the teachers' stated attitudes toward reading instruction were positive; however, for reasons such as lack of time, a scarcity of interesting passages, and a lack of interest in reading long passages, their actual classroom practice did not reflect an equally positive attitude toward teaching reading.

Keywords: English as foreign language classroom, English language, reading skill, teachers' attitude

Procedia PDF Downloads 140
6159 Roasting Process of Sesame Seeds Modelling Using Gene Expression Programming: A Comparative Analysis with Response Surface Methodology

Authors: Alime Cengiz, Talip Kahyaoglu

Abstract:

The roasting process is of major importance in obtaining the desired aromatic taste of nuts and seeds. In this study, two roasting processes were applied to hulled sesame seeds: vacuum oven and hot air roasting. The efficiency of Gene Expression Programming (GEP), a soft computing technique from evolutionary computation that describes cause-and-effect relationships in data modelling, and of response surface methodology (RSM) was examined in modelling the roasting processes over a range of temperatures (120-180°C) and times (30-60 min). Color attributes (L*, a*, b*, Browning Index (BI)), textural properties (hardness and fracturability), and moisture content were evaluated and modelled by RSM and GEP. The GEP-based formulations and the RSM approach were compared with experimental results and evaluated according to correlation coefficients. Both GEP and RSM were able to adequately learn the relation between roasting conditions and the physical and textural parameters of the roasted seeds. However, GEP had better prediction performance than RSM, with high correlation coefficients (R² > 0.92) for all quality parameters. This result indicates that soft computing techniques have better capability for describing the physical changes occurring in sesame seeds during roasting.
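The model comparison rests on the coefficient of determination; it can be illustrated with a plain R² computation, where the observed and predicted values below are synthetic, not the study's data:

```python
# Coefficient of determination used to compare model fits (the paper reports
# R^2 > 0.92 for GEP); all values below are synthetic, for illustration only.
def r_squared(observed, predicted):
    """R^2 between observations and model output."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

browning_index = [12.1, 14.8, 17.9, 21.2, 24.6]   # hypothetical observations
gep_prediction = [12.4, 14.5, 18.1, 21.0, 24.9]   # hypothetical model output
r2 = r_squared(browning_index, gep_prediction)
print(round(r2, 3))
```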

Keywords: gene expression programming, response surface methodology, roasting, sesame seed

Procedia PDF Downloads 407
6158 Development of pH Responsive Nanoparticles for Colon Targeted Drug Delivery System

Authors: V. Balamuralidhara

Abstract:

The aim of the present work was to develop Paclitaxel-loaded polyacrylamide-grafted guar gum nanoparticles as pH-responsive nanoparticle systems for targeting the colon. The pH-sensitive nanoparticles were prepared by a modified ionotropic gelation technique. The prepared nanoparticles showed mean diameters in the range of 264±0.676 nm to 726±0.671 nm and a negative net charge of 10.8 mV to 35.4 mV. Fourier Transform Infrared Spectroscopy (FT-IR) and Differential Scanning Calorimetry (DSC) studies suggested that there was no chemical interaction between the drug and the polymers. The encapsulation efficiency of the drug was found to be 40.92% to 48.14%. The suitability of the polyacrylamide-grafted guar gum ERNs for the release of Paclitaxel was studied by in vitro release at pH 1.2 and 7.4. No significant drug release was observed at gastric pH, while 97.63% drug release at pH 7.4 was obtained for the optimized formulation F3 at the end of 12 h. In vivo drug targeting performance of the optimized formulation (F3) and the pure drug Paclitaxel was evaluated by HPLC. It was observed that polyacrylamide-grafted guar gum can be used to prepare nanoparticles for targeting the drug to the colon. The release performance was greatly affected by the materials used in ERN preparation, which allows maximum release at the colon's pH. It may be concluded that polyacrylamide-grafted guar gum nanoparticles loaded with Paclitaxel show the desired pH-specific release, making this a promising approach for colonic drug delivery with appropriate site specificity and controlled release.

Keywords: colon targeting, polyacrylamide grafted guar gum nanoparticles, paclitaxel, nanoparticles

Procedia PDF Downloads 341
6157 Dynamic Software Product Lines for Customer Centric Context Aware Business Process Management

Authors: Bochra Khiari, Lamia Labed

Abstract:

In the new digital marketplace, organizations are striving for a proactive position by leveraging the great potential of disruptive technologies to seize the full opportunity of the digital revolution and reshape their customer value propositions. New technologies such as big data analytics, which provide prediction of future events based on real-time information, are being integrated into business process management (BPM), which calls for additional core capabilities such as dynamic adaptation, autonomic behavior, runtime reconfiguration, and post-deployment activities to manage unforeseen scenarios at runtime in a situated and changeable context. Dynamic Software Product Lines (DSPL) is an emerging paradigm that supports these runtime variability mechanisms. However, few works exploiting DSPL principles and techniques in the BPM domain have been proposed so far. In this paper, we propose a conceptual approach, DynPL4CBPM, which integrates DSPL concepts, along with the related dynamic properties, into the whole BPM lifecycle in order to dynamically adapt business processes to different context conditions in an individual environment.

Keywords: adaptive processes, context aware business process management, customer centric business process management, dynamic software product lines

Procedia PDF Downloads 147
6156 HcDD: The Hybrid Combination of Disk Drives in Active Storage Systems

Authors: Shu Yin, Zhiyang Ding, Jianzhong Huang, Xiaojun Ruan, Xiaomin Zhu, Xiao Qin

Abstract:

Since large-scale, data-intensive applications are now widely deployed, there is a growing demand for high-performance storage systems to support them. Compared with traditional storage systems, next-generation systems will embrace dedicated processors to reduce the computational load on host machines and will employ hybrid combinations of different storage devices. The advent of the flash-memory-based solid state disk has played a critical role in revolutionizing the storage world. However, rather than simply replacing the traditional magnetic hard disk with the solid state disk, finding a complementary approach that incorporates both is more challenging and attractive. This paper explores active storage, an emerging storage configuration, in terms of its architecture and design, its parallel processing capability, its cooperation with other machines in a cluster computing environment, and a disk configuration based on a hybrid combination of different types of disk drives. Experimental results indicate that the proposed HcDD achieves better I/O performance and a longer storage system lifespan.

Keywords: parallel storage system, hybrid storage system, data intensive, solid state disks, reliability

Procedia PDF Downloads 428
6155 The Theory of Domination at the Bane of Conflict Resolution and Peace Building Processes in Cameroon

Authors: Nkatow Mafany Christian

Abstract:

According to the UNHCR's annual database, humanitarian crises have been on the increase globally since the beginning of the 21st century, especially in the Middle East and Sub-Saharan Africa. Cameroon is one of the countries that has suffered tremendously from humanitarian challenges in recent years, especially with crises in the Far North, the East, and its two English-speaking regions. These have resulted from the government's failed conflict resolution and peacebuilding mechanisms. The paper draws on this premise to argue that the failure to reach a consensus to curb internal conflicts has largely been due to the government's attachment to a domineering posture, which emphasizes the imposition of peace terms by a superordinate agency (the government) on subordinate (aggrieved) entities. This has stalled the peace efforts engaged so far to address the dreaded armed conflict in the North West and South West Regions, leading to its persistence. The paper exploits written, oral, and online sources to sustain its argument. It suggests that an eclectic approach to resolving conflicts, emphasizing open and frank dialogue as well as a review of root causes, can go a long way not only to build trust but also to address the Anglophone problem in Cameroon.

Keywords: conflict, conflict resolution, peace building, humanitarian crisis

Procedia PDF Downloads 52
6154 Providing a Suitable Model for Launching New Home Appliances Products to the Market

Authors: Ebrahim Sabermaash Eshghi, Donna Sandsmark

Abstract:

In the changing economic conditions of the modern world, one of the most important issues facing managers of firms is increasing sales and profitability through the sale of newly developed products, while reducing unnecessary costs remains an essential priority for managers adapting to current business conditions. Uncertainty now dominates virtually all industries. Accordingly, this research investigates the influence of different aspects of introducing products to the market. The study uses a quantitative-qualitative approach (interviews and a questionnaire). In total, 103 informed managers and experts of the Pars-Khazar Company were examined through a census. The validity of the measurement tools was approved through expert judgment, and their reliability was established through a Cronbach's alpha coefficient of 0.930; overall, the validity and reliability of the tools were confirmed. The results of the regression test revealed that all aspects of product introduction had a positive and significant influence on product performance. In addition, the influence on product performance of two new factors raised in the interviews, human resource management and management of the product's pre-test, was confirmed.
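The reported reliability figure is a Cronbach's alpha coefficient; a minimal sketch of the computation, on synthetic item scores rather than the study's data, might look like this:

```python
# Cronbach's alpha for internal-consistency reliability (the paper reports
# 0.930); the item scores below are synthetic, for illustration only.
from statistics import pvariance

def cronbach_alpha(items):
    """items: one list of scores per questionnaire item, same respondents."""
    k = len(items)
    item_vars = sum(pvariance(scores) for scores in items)
    totals = [sum(col) for col in zip(*items)]   # per-respondent total score
    return (k / (k - 1)) * (1 - item_vars / pvariance(totals))

items = [
    [4, 5, 3, 4, 5],   # item 1 scores across 5 respondents
    [4, 4, 3, 5, 5],   # item 2
    [5, 5, 2, 4, 4],   # item 3
]
alpha = cronbach_alpha(items)
print(round(alpha, 3))
```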

Keywords: introducing products, performance, home appliances, price, advertisement, production

Procedia PDF Downloads 199
6153 Characterizing and Developing the Clinical Grade Microbiome Assay with a Robust Bioinformatics Pipeline for Supporting Precision Medicine Driven Clinical Development

Authors: Danyi Wang, Andrew Schriefer, Dennis O'Rourke, Brajendra Kumar, Yang Liu, Fei Zhong, Juergen Scheuenpflug, Zheng Feng

Abstract:

Purpose: It has been recognized that the microbiome plays critical roles in disease pathogenesis, including cancer, autoimmune disease, and multiple sclerosis. To develop a clinical-grade assay for exploring microbiome-derived clinical biomarkers across disease areas, a two-phase approach was implemented: 1) identification of the optimal sample preparation reagents using pre-mixed bacteria and healthy donor stool samples, coupled with the proprietary Sigma-Aldrich® bioinformatics solution; 2) exploratory analysis of patient samples for enabling precision medicine. Study Procedure: In the phase 1 study, we first compared the 16S sequencing results of two ATCC® microbiome standards (MSA-2002 and MSA-2003) across five different extraction kits (kits A, B, C, D, and E). Both microbiome standards were extracted in triplicate with all extraction kits. Following isolation, DNA quantity was determined by Qubit assay. DNA quality was assessed to determine purity and to confirm that the extracted DNA was of high molecular weight. Bacterial 16S ribosomal ribonucleic acid (rRNA) amplicons were generated by amplification of the V3/V4 hypervariable region of the 16S rRNA gene. Sequencing was performed using a 2x300 bp paired-end configuration on the Illumina MiSeq. Fastq files were analyzed using the Sigma-Aldrich® Microbiome Platform, a cloud-based service that offers best-in-class 16S-seq and WGS analysis pipelines and databases. The Platform and its methods have been extensively benchmarked using microbiome standards generated internally by MilliporeSigma and by external providers. Data Summary: The DNA yield using extraction kits D and E was below the limit of detection (100 pg/µl) of the Qubit assay, as both kits are intended for samples with low bacterial counts; the pre-mixed bacterial pellets at high concentrations, with inputs of 2 × 10⁶ cells for MSA-2002 and 1 × 10⁶ cells for MSA-2003, were not compatible with these kits.
Among the remaining three extraction kits, kit A produced the greatest yield, whereas kit B produced the least (Kit-A/MSA-2002: 174.25 ± 34.98; Kit-A/MSA-2003: 179.89 ± 30.18; Kit-B/MSA-2002: 27.86 ± 9.35; Kit-B/MSA-2003: 23.14 ± 6.39; Kit-C/MSA-2002: 55.19 ± 10.18; Kit-C/MSA-2003: 35.80 ± 11.41 (mean ± SD)). The PCoA 3D visualization of the weighted UniFrac beta diversity shows that kits A and C cluster closely together, while kit B appears as an outlier; the kit A samples also cluster more tightly than those of the other kits. The taxonomic profiles of kit B have lower recall when compared with the known mixture profiles, indicating that kit B was inefficient at detecting some of the bacteria. Conclusion: Our data demonstrate that the DNA extraction method impacts the DNA concentration, purity, and microbial communities detected by next-generation sequencing analysis. A further comparison of microbiome analysis performance using healthy stool samples is underway, and colorectal cancer patient samples will be acquired to further explore clinical utility. Collectively, our comprehensive qualification approach, including the evaluation of optimal DNA extraction conditions, the inclusion of positive controls, and the implementation of a robust, qualified bioinformatics pipeline, ensures accurate characterization of the microbiota in a complex matrix, helping to decipher the underlying biology and enable precision medicine.
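The kit ranking can be reproduced directly from the mean yields quoted in the abstract:

```python
# Ranking the extraction kits by mean DNA yield, using the mean values
# reported in the abstract (one value per kit/standard pair).
yields = {
    ("Kit-A", "MSA-2002"): 174.25, ("Kit-A", "MSA-2003"): 179.89,
    ("Kit-B", "MSA-2002"): 27.86,  ("Kit-B", "MSA-2003"): 23.14,
    ("Kit-C", "MSA-2002"): 55.19,  ("Kit-C", "MSA-2003"): 35.80,
}
per_kit = {}
for (kit, _standard), value in yields.items():
    per_kit.setdefault(kit, []).append(value)
mean_yield = {kit: sum(v) / len(v) for kit, v in per_kit.items()}
best = max(mean_yield, key=mean_yield.get)
print(best, round(mean_yield[best], 2))
```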

Keywords: 16S rRNA sequencing, analytical validation, bioinformatics pipeline, metagenomics

Procedia PDF Downloads 145
6152 Regional Flood-Duration-Frequency Models for Norway

Authors: Danielle M. Barna, Kolbjørn Engeland, Thordis Thorarinsdottir, Chong-Yu Xu

Abstract:

Design flood values give estimates of flood magnitude for a given return period and are essential to adaptive decisions around land use planning, infrastructure design, and disaster mitigation. Design flood values are often needed at locations with insufficient data. Additionally, in hydrologic applications where flood retention is important (e.g., floodplain management and reservoir design), design flood values are required for different flood durations. A statistical approach to this problem is the development of a regression model for extremes in which some of the parameters depend on flood duration in addition to being covariate-dependent. In hydrology, this is called a regional flood-duration-frequency (regional QDF) model. Typically, the underlying statistical distribution is chosen to be the Generalized Extreme Value (GEV) distribution. However, as the support of the GEV distribution depends on both its parameters and the range of the data, special care must be taken in developing the regional model. In particular, we find that the GEV is problematic in a GAMLSS-type analysis because of the difficulty of proposing a link function that is independent of the unknown parameters and the observed data. We discuss these challenges in the context of developing a regional QDF model for Norway.
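The support issue can be seen in the GEV quantile (return level) function, whose value is bounded on one side whenever the shape parameter is nonzero; the location, scale, and shape parameters below are hypothetical, not fitted to Norwegian data:

```python
# GEV quantile (return level): z_p = mu + sigma/xi * ((-ln p)^(-xi) - 1),
# with the Gumbel limit mu - sigma*ln(-ln p) as xi -> 0.
import math

def gev_quantile(p, loc, scale, shape):
    """Value z_p such that the GEV CDF satisfies F(z_p) = p."""
    if abs(shape) < 1e-12:                      # Gumbel limit
        return loc - scale * math.log(-math.log(p))
    return loc + scale / shape * ((-math.log(p)) ** (-shape) - 1.0)

# Hypothetical parameters: a 100-year design flood (non-exceedance p = 0.99)
z100 = gev_quantile(0.99, loc=100.0, scale=30.0, shape=0.1)
print(round(z100, 1))
```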

Keywords: design flood values, Bayesian statistics, regression modeling of extremes, extreme value analysis, GEV

Procedia PDF Downloads 61
6151 High Order Block Implicit Multi-Step (Hobim) Methods for the Solution of Stiff Ordinary Differential Equations

Authors: J. P. Chollom, G. M. Kumleng, S. Longwap

Abstract:

The search for higher order A-stable linear multi-step methods has long interested numerical analysts and has been pursued through higher derivatives of the solution, or by inserting additional off-step points, super-future points, and the like. Such methods are suitable for stiff differential equations, which place a severe restriction on the choice of step size: only methods with large regions of absolute stability remain suitable for these equations. In this paper, high order block implicit multi-step methods of hybrid form up to order twelve are constructed using the multi-step collocation approach, by inserting one or more off-step points into the multi-step method. The accuracy and stability properties of the new methods are investigated and shown to yield A-stable methods, a property desirable of methods suitable for solving stiff ODEs. The new high order block implicit multistep methods, used as block integrators, are tested on stiff differential systems, and the results reveal that they are efficient and compete favourably with the state-of-the-art Matlab ode23 code.
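Why A-stability matters for stiff problems can be illustrated with the simplest implicit method on the scalar test equation y' = λy; this is only a sketch of the stability property, not the authors' block collocation method:

```python
# Backward (implicit) Euler vs. explicit Euler on the stiff test equation
# y' = lam*y. Backward Euler is A-stable, so it decays for any step size;
# explicit Euler blows up once |1 + h*lam| > 1.
lam = -1000.0
h = 0.1                       # far outside explicit Euler's stability region
y_exp, y_imp = 1.0, 1.0
for _ in range(10):
    y_exp = y_exp + h * lam * y_exp       # explicit update: factor 1 + h*lam
    y_imp = y_imp / (1.0 - h * lam)       # implicit update: factor 1/(1 - h*lam)
print(abs(y_exp) > 1.0, abs(y_imp) < 1.0)
```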

Keywords: block linear multistep methods, high order, implicit, stiff differential equations

Procedia PDF Downloads 341
6150 Stochastic Prioritization of Dependent Actuarial Risks: Preferences among Prospects

Authors: Ezgi Nevruz, Kasirga Yildirak, Ashis SenGupta

Abstract:

Comparing or ranking risks is the main motivation behind the human trait of making choices. Cumulative prospect theory (CPT) is a preference theory that evaluates perception and bias in decision making under risk and uncertainty. We aim to investigate the aggregate claims of different risk classes in terms of their comparability and amenability to ordering when the impact of risk perception is considered. To this end, we prioritize aggregate claims, taken as actuarial risks, using various stochastic ordering relations, such as stochastic dominance and stop-loss dominance, proposed within the framework of partial order theory. We take into account the dependency of individual claims exposed to similar environmental risks. First, we modify the zero-utility premium principle in order to obtain a solution for the stop-loss premium under CPT. Then, we propose a stochastic stop-loss dominance of the aggregate claims and find a relation between stop-loss dominance and first-order stochastic dominance under the dependence assumption, using properties of familiar as well as some emerging multivariate claim distributions.
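First-order stochastic dominance between two empirical claim distributions can be checked directly from their empirical CDFs; the claim samples below are illustrative, not actuarial data:

```python
# Empirical first-order stochastic dominance: X dominates Y (X is
# stochastically larger) if F_X(t) <= F_Y(t) at every point t.
def ecdf(sample, t):
    """Empirical CDF of `sample` evaluated at t."""
    return sum(x <= t for x in sample) / len(sample)

def fsd_dominates(x, y):
    """True if x first-order stochastically dominates y."""
    grid = sorted(set(x) | set(y))
    return all(ecdf(x, t) <= ecdf(y, t) for t in grid)

claims_a = [10, 20, 30, 40]      # stochastically larger aggregate claims
claims_b = [5, 15, 25, 35]
print(fsd_dominates(claims_a, claims_b))
```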

Keywords: cumulative prospect theory, partial order theory, risk perception, stochastic dominance, stop-loss dominance

Procedia PDF Downloads 308
6149 The Advancements of Transformer Models in Part-of-Speech Tagging System for Low-Resource Tigrinya Language

Authors: Shamm Kidane, Ibrahim Abdella, Fitsum Gaim, Simon Mulugeta, Sirak Asmerom, Natnael Ambasager, Yoel Ghebrihiwot

Abstract:

The call for natural language processing (NLP) systems for low-resource languages has become more apparent than ever in the past few years, and preparing such systems still presents arduous challenges. This paper presents an improved version of the Nagaoka Tigrinya Corpus for a part-of-speech (POS) classification system in the Tigrinya language. The initial Nagaoka dataset was expanded, bringing the new tagged corpus to 118K tokens comprising the 12 basic POS annotations used previously. The additional content was annotated manually and stringently, following rules similar to those of the former dataset, and was formatted in CoNLL format. The system makes use of the monolingually pre-trained TiELECTRA, TiBERT, and TiRoBERTa transformer models. The highest achieved score is an impressive weighted F1-score of 94.2%, which surpasses previous systems by a significant margin. The system will prove useful for the progress of NLP-related tasks for Tigrinya and similar low-resource languages, with room for cross-referencing higher-resource languages.
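The reported score is a support-weighted F1; a minimal sketch of that aggregation, with synthetic tag supports and per-tag F1 values rather than the paper's results:

```python
# Support-weighted F1 across POS tags (the paper reports 94.2%); the tag
# supports and per-tag scores below are synthetic, for illustration only.
def weighted_f1(per_tag):
    """per_tag maps tag -> (support, per-tag F1); returns weighted mean F1."""
    total = sum(support for support, _ in per_tag.values())
    return sum(support * f1 for support, f1 in per_tag.values()) / total

scores = {"NOUN": (500, 0.96), "VERB": (300, 0.93), "ADJ": (200, 0.90)}
f1 = weighted_f1(scores)
print(round(f1, 3))
```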

Keywords: Tigrinya POS corpus, TiBERT, TiRoBERTa, conditional random fields

Procedia PDF Downloads 78
6148 Fuzzy and Fuzzy-PI Controller for Rotor Speed of Gas Turbine

Authors: Mandar Ghodekar, Sharad Jadhav, Sangram Jadhav

Abstract:

Controlling rotor speed during startup and under varying load conditions is one of the most difficult tasks in gas turbine operation. In this paper, a power plant gas turbine (GE9001E) is considered for this purpose, and fuzzy and fuzzy-PI rotor speed controllers are designed. The goal of the presented controllers is to keep the turbine rotor speed within predefined limits during both startup and normal operation. The fuzzy controller and the fuzzy-PI controller are designed using the Takagi-Sugeno and Mamdani methods, respectively. In applying fuzzy-PI control to the gas turbine plant, the tuning parameters (Kp and Ki) are modified online by a fuzzy logic approach. Error and rate of change of error are the inputs, and change in fuel flow is the output, for both controllers; the rotor speed of the gas turbine is thus controlled by modifying the fuel flow. An identified linear ARX model of the gas turbine is used in designing the controllers. In the simulations, demand power is taken as a disturbance input, the inlet guide vane (IGV) position is assumed fixed, and the constraint on the fuel flow is taken into account. The performance of the presented controllers is compared with each other, as well as with H∞ robust and MPC controllers, for the same operating conditions.
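The online gain-tuning idea can be sketched with crisp rules standing in for full fuzzy inference: the speed error and its rate of change drive adjustments to Kp and Ki. The membership boundaries and gain steps below are invented for illustration, not taken from the paper's rule base:

```python
# Minimal crisp sketch of fuzzy gain scheduling for a PI speed controller.
# A real fuzzy-PI controller would use graded membership functions; the
# thresholds and multipliers here are invented for illustration.
def fuzzy_pi_gains(error, d_error, kp, ki):
    big = abs(error) > 50.0          # large speed error, in rpm
    fast = abs(d_error) > 10.0       # fast-changing error, in rpm/s
    if big and fast:                 # far from setpoint and diverging: act harder
        kp, ki = kp * 1.2, ki * 1.1
    elif not big and not fast:       # near setpoint: soften to limit overshoot
        kp, ki = kp * 0.9, ki * 0.95
    return kp, ki

kp, ki = fuzzy_pi_gains(error=80.0, d_error=15.0, kp=1.0, ki=0.5)
print(round(kp, 2), round(ki, 2))
```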

Keywords: gas turbine, fuzzy controller, fuzzy PI controller, power plant

Procedia PDF Downloads 320
6147 Two Efficient Heuristic Algorithms for the Integrated Production Planning and Warehouse Layout Problem

Authors: Mohammad Pourmohammadi Fallah, Maziar Salahi

Abstract:

In the literature, a mixed-integer linear programming model for the integrated production planning and warehouse layout problem has been proposed. To solve the model, its authors proposed a Lagrangian relax-and-fix heuristic that takes a significant amount of time and stops with gaps above 5% for large-scale instances. Here, we present two heuristic algorithms for the problem. In the first, we use a greedy approach: warehouse locations with lower reservation costs and lower transportation costs (from the production area to the locations and from the locations to the output point) are allocated to items with higher demands, and a smaller model is then solved. In the second heuristic, we first sort items in descending order according to the fraction given by the sum of the item's demands over the time horizon plus its maximum demand over the horizon, divided by the sum of all its demands over the horizon. We then partition the sorted items into groups of 3, 4, or 5 and solve a small-scale optimization problem for each group, aiming to improve the solution of the first heuristic. Our preliminary numerical results show the effectiveness of the proposed heuristics.
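The first heuristic's greedy pairing, high-demand items to low-cost locations, can be sketched as follows; the item demands and location costs are hypothetical:

```python
# Greedy sketch of the first heuristic: sort items by total demand
# (descending) and locations by combined reservation-plus-transport cost
# (ascending), then pair them off. All numbers are hypothetical.
items = {"item1": 120, "item2": 300, "item3": 80}      # total demand per item
locations = {"loc1": 7.0, "loc2": 3.5, "loc3": 5.0}    # combined cost per slot

ranked_items = sorted(items, key=items.get, reverse=True)
ranked_locs = sorted(locations, key=locations.get)
assignment = dict(zip(ranked_items, ranked_locs))      # best item -> cheapest slot
print(assignment)
```

In the paper's method this assignment is only a warm start; a reduced MILP is solved afterwards.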

Keywords: capacitated lot-sizing, warehouse layout, mixed-integer linear programming, heuristic algorithms

Procedia PDF Downloads 179
6146 Tangible Losses, Intangible Traumas: Re-envisioning Recovery Following the Lytton Creek Fire 2021 through Place Attachment Lens

Authors: Tugba Altin

Abstract:

In an era marked by pronounced climate change consequences, communities confront traumatic events that yield both tangible and intangible repercussions. Such events not only cause discernible damage to the landscape but also deeply affect intangible aspects, including emotional distress and disruptions to cultural landscapes. The Lytton Creek Fire of 2021 serves as a case in point. Beyond the visible destruction, the less overt but profoundly impactful disturbance to place attachment (PA) is scrutinized. PA, representing the emotional and cognitive bonds individuals establish with their environments, is crucial for understanding how such events impact cultural identity and connection to the land. The study underscores the significance of addressing both tangible and intangible traumas for holistic community recovery. As communities renegotiate their affiliations with altered environments, the cultural landscape emerges as instrumental in shaping place-based identities. This renewed understanding is pivotal for reshaping adaptation planning. The research advocates for adaptation strategies rooted in the lived experiences and testimonies of the affected populations. By incorporating both the tangible and intangible facets of trauma, planning efforts can become more culturally attuned and emotionally insightful, fostering true resonance with the affected communities. Through such a comprehensive lens, this study contributes to enriching the climate change discourse, emphasizing the intertwined nature of tangible recovery and the imperative of emotional and cultural healing after environmental disasters. Following the pronounced aftermath of the Lytton Creek Fire in 2021, the research aims to understand in depth its impact on place attachment.
The interpretive phenomenological approach, enriched by a hermeneutic framework, is adopted, emphasizing the experiences of the Lytton community and co-researchers. Phenomenology informs the understanding of 'place' as the focal point of attachment, providing insights into its formation and evolution after traumatic events. Data collection departs from conventional methods: instead of traditional interviews, walking audio sessions and photo elicitation are used. These allow co-researchers to immerse themselves in the environment and to re-experience and articulate memories and feelings in real time. Walking audio facilitates reflections on spatial narratives post-trauma, while photo voices capture intangible emotions, enabling the visualization of place-based experiences. The analysis is collaborative, ensuring that the co-researchers' experiences and interpretations remain central and emphasizing their agency in knowledge production; the process is kept rigorous by the blend of interpretive phenomenology and hermeneutic insights. The findings underscore the need for adaptation and recovery efforts to address emotional traumas alongside tangible damages. By exploring PA post-disaster, the research not only fills a significant gap but also advocates for an inclusive approach to community recovery. Furthermore, the participatory methodologies employed challenge traditional research paradigms, heralding potential shifts in qualitative research norms.

Keywords: wildfire recovery, place attachment, trauma recovery, cultural landscape, visual methodologies

Procedia PDF Downloads 63
6145 Probabilistic Model for Evaluating Seismic Soil Liquefaction Based on Energy Approach

Authors: Hamid Rostami, Ali Fallah Yeznabad, Mohammad H. Baziar

Abstract:

The energy-based method for evaluating seismic soil liquefaction has two main components. The first is the demand energy, the energy dissipated by an earthquake at a site; the second is the capacity energy, a representation of the soil's resistance against liquefaction hazard. In this study, using a statistical analysis of data recorded by 14 down-hole array sites in California, an empirical equation was developed to estimate the demand energy at a site. Because determining the capacity energy at a site requires several site calibration factors obtained from experimental tests, the standard penetration test (SPT) N-value was adopted in this study as a proxy for the capacity energy. Based on this assumption, the empirical equation was used to calculate the demand energy for 193 liquefied and non-liquefied sites, and these values were plotted against the corresponding SPT numbers for all sites. Subsequently, a discriminant analysis was employed to determine the equations of several boundary curves for various liquefaction likelihoods. Finally, the probabilistic model was compared with the commonly used stress-based method. The results clearly showed that the energy-based method can be more reliable than the conventional stress-based method in evaluating liquefaction occurrence.
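The discriminant step described above can be sketched in a few lines: with SPT N-value and (log) demand energy as the two coordinates, Fisher's linear discriminant yields a single boundary line between liquefied and non-liquefied sites. The data below are synthetic stand-ins, not the paper's 193 case histories, and the class means, spreads, and units are invented purely for illustration.

```python
import numpy as np

# Synthetic (SPT N-value, log demand energy) samples: liquefied sites tend
# to have low N and high demand energy, non-liquefied sites the reverse.
rng = np.random.default_rng(0)
liquefied = rng.normal([10.0, 3.5], [3.0, 0.5], size=(60, 2))
non_liquefied = rng.normal([25.0, 2.0], [3.0, 0.5], size=(60, 2))

def fisher_discriminant(a, b):
    """Return weight vector w and threshold c so that x @ w > c predicts class a."""
    mean_a, mean_b = a.mean(axis=0), b.mean(axis=0)
    # Pooled within-class covariance from the two scatter matrices.
    pooled = np.cov(a.T) * (len(a) - 1) + np.cov(b.T) * (len(b) - 1)
    pooled /= (len(a) + len(b) - 2)
    w = np.linalg.solve(pooled, mean_a - mean_b)
    c = w @ (mean_a + mean_b) / 2.0  # midpoint between projected class means
    return w, c

w, c = fisher_discriminant(liquefied, non_liquefied)
correct = (liquefied @ w > c).sum() + (non_liquefied @ w <= c).sum()
accuracy = correct / 120.0
```

Shifting the threshold `c` instead of using the midpoint is one way to trace out boundary curves for different liquefaction likelihoods, in the spirit of the probabilistic curves the abstract describes.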

Keywords: energy demand, liquefaction, probabilistic analysis, SPT number

Procedia PDF Downloads 352
6144 Framework to Quantify Customer Experience

Authors: Anant Sharma, Ashwin Rajan

Abstract:

Customer experience is measured today by defining a set of metrics and KPIs, setting thresholds, and defining triggers across those thresholds. While this is an effective way of measuring against a Key Performance Indicator (referred to as KPI in the rest of the paper), the approach cannot capture the various nuances that make up the overall customer experience. Customers consume a product or service at various levels, which is reflected not only in metrics like Customer Satisfaction (CXSAT) or Net Promoter Score (NPS) but also across other measurements such as recurring revenue, frequency of service usage, e-learning, and depth of usage. Here we explore an alternative method of measuring customer experience by flipping the traditional view: rather than rolling customers up to a metric, we roll metrics up to hierarchies and then measure customer experience. This method allows any team to quantify customer experience across multiple touchpoints in a customer's journey. We make use of various data sources that contain information for metrics like CXSAT, NPS, renewals, and depth of service usage collected across a customer lifecycle. This data can be mined systematically to find linkages between different data points such as geographies, business groups, products, and time. Additional views can be generated by blending synthetic contexts into the data to show trends and top/bottom reports. We have created a framework that allows us to measure customer experience using the above logic.
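The roll-up idea can be illustrated with a toy aggregation: metrics attached to individual records are blended into one experience score, which is then averaged up a hierarchy level such as geography. The metric names echo the abstract, but the records, weights, and normalization scales below are invented for the sketch and are not the authors' framework.

```python
from collections import defaultdict

# Toy records: each carries hierarchy attributes (geo, product) plus metrics.
records = [
    {"geo": "EMEA", "product": "A", "cxsat": 4.2, "nps": 30, "renewed": 1},
    {"geo": "EMEA", "product": "B", "cxsat": 3.1, "nps": -5, "renewed": 0},
    {"geo": "APAC", "product": "A", "cxsat": 4.8, "nps": 55, "renewed": 1},
]

WEIGHTS = {"cxsat": 0.5, "nps": 0.3, "renewed": 0.2}  # assumed blend

def rollup(records, level):
    """Average a weighted experience score at one hierarchy level."""
    sums, counts = defaultdict(float), defaultdict(int)
    for r in records:
        # Normalize each metric to 0-1 before blending (assumed scales:
        # CXSAT out of 5, NPS in [-100, 100], renewal already 0/1).
        score = (WEIGHTS["cxsat"] * r["cxsat"] / 5.0
                 + WEIGHTS["nps"] * (r["nps"] + 100) / 200.0
                 + WEIGHTS["renewed"] * r["renewed"])
        sums[r[level]] += score
        counts[r[level]] += 1
    return {k: sums[k] / counts[k] for k in sums}

by_geo = rollup(records, "geo")      # e.g. {"EMEA": ..., "APAC": ...}
by_product = rollup(records, "product")
```

Calling `rollup` with a different `level` key gives the alternative views the abstract mentions (geographies, business groups, products), all from the same underlying records.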

Keywords: analytics, customer experience, BI, business operations, KPIs, metrics

Procedia PDF Downloads 58
6143 Ilorin Traditional Architecture as a Good Example of a Green Building Design

Authors: Olutola Funmilayo Adekeye

Abstract:

Traditional African practice of architecture can be said to be deeply rooted in Green Architecture in concept, design, and execution. A study of the ancient building techniques in the Ilorin Emirate reveals a prominent, eco-centric application of Green Architecture principles. In the pre-colonial era, before the introduction of modern architecture and Western building materials, Nigerian traditional communities built their houses to meet their cultural, religious, and social needs using mainly indigenous building materials such as mud (Amo), cow dung (Boto), straw (Koriko), and palm fronds (Imo-Ope), to mention a few. This research attempts to identify the various techniques of applying traditional African principles of Green Architecture to Ilorin traditional buildings. It examines and assesses several case studies to understand the extent to which Green Architecture principles have been applied to traditional building designs that are still preserved today in Ilorin, Nigeria. Furthermore, this study intends to answer many questions, which can be summarized into two basic ones: (1) Which of what are today recognized as important green architecture principles have been applied to Ilorin traditional buildings? (2) To what extent have the green architecture principles applied to Ilorin traditional buildings been ways of demonstrating a cultural attachment to the earth, as an expression of the African sense of the human being as one with nature?

Keywords: green architecture, Ilorin, traditional buildings, design principles, ecocentric, application

Procedia PDF Downloads 526
6142 Challenges Caused by the Integration of Technology as a Pedagogy in One of the Historically Disadvantaged Higher Education Institutions

Authors: Rachel Gugu Mkhasibe

Abstract:

Incorporation of technology as a pedagogy has many benefits: for instance, improved pedagogy, increased information access, and increased cooperation and collaboration. However, beneficial as it may be, this integration has not been widely adopted in most historically Black higher education institutions, especially those in developing countries. For example, given the socioeconomic background of students in historically Black universities, the weak financial support available to these universities, and their large student populations, many students struggle to access the recommended modern physical resources such as iPads, laptops, and mobile phones, to name a few. This contributes to an increase in educational inequalities. A qualitative research approach was utilized in this work to gather detailed data about the obstacles created by the integration of technology as a pedagogy. Interviews were conducted to generate data from 20 academics and 10 level-two students from one of the historically disadvantaged higher education institutions in South Africa. The findings revealed that although both students and academics overwhelmingly supported the integration of technology as a pedagogy in their institution, the environment in which they found themselves compromised that integration. Therefore, this paper recommends that the Department of Higher Education and university management intervene and budget for technology to be provided in all institutions of higher education, regardless of where the institutions are situated.

Keywords: collaboration, integration, pedagogy, technology

Procedia PDF Downloads 66
6141 Contemplating Charge Transport by Modeling of DNA Nucleobases Based Nano Structures

Authors: Rajan Vohra, Ravinder Singh Sawhney, Kunwar Partap Singh

Abstract:

Electrical charge transport through two basic DNA nucleobases, thymine and adenine, has been investigated and analyzed using the jellium model approach. The FFT-2D computations were performed with semi-empirical Extended Huckel Theory using the Atomistix ToolKit to contemplate charge-transport metrics such as current and conductance. The envisaged data are further evaluated in terms of the transmission spectrum, the HOMO-LUMO gap (HLG), and the number of electrons. We scrutinized the behavior of the devices in the range of -2 V to 2 V with a step size of 0.2 V. We observe that both thymine and adenine can act as molecular devices when sandwiched between two gold probes. A prominent observation is a drop in the HLGs of adenine and thymine when working as a device compared to their intrinsic values, which is comparatively more visible in the case of adenine. The current in the thymine-based device exhibits a linear increase with voltage in spite of having low conductance. Further, the broad transmission peaks represent the strong coupling of the electrodes to the scattering molecule (thymine). Moreover, the observed current in the case of thymine is almost 3-4 times that observed for adenine. A negative differential resistance (NDR) effect has been perceived in the case of the adenine-based device at higher bias voltages and can be utilized in various future electronics applications.

Keywords: adenine, DNA, extended Huckel, thymine, transmission spectra

Procedia PDF Downloads 137
6140 Platooning Method Using Dynamic Correlation of Destination Vectors in Urban Areas

Authors: Yuya Tanigami, Naoaki Yamanaka, Satoru Okamoto

Abstract:

Economic losses due to delays from traffic congestion in urban transportation networks have become a more serious social problem as traffic volume increases. Platooning has recently been attracting attention from many researchers as a way to alleviate traffic jams, especially on highways, where it can have positive effects such as reducing inter-vehicular distance and reducing air resistance. However, the impacts of platooning on urban roads have not been addressed in detail, since traffic lights may break the platoons. In this study, we propose a platooning method that uses the L2 norm and cosine similarity to form a platoon from vehicles with highly similar routes. We also investigate the sorting method within a platoon according to each vehicle's straightness. Our proposed sorting method, which uses two lanes, eliminates head-of-line blocking at intersections and improves throughput there. This paper proposes a cyber-physical system (CPS) approach to collaborative urban platoon control. We conduct simulations using the traffic simulator SUMO on a road network that imitates Manhattan Island. Results from SUMO confirmed that our method shortens the average travel time by 10-20%. This paper shows the validity of forming a platoon based on destination vectors and of sorting vehicles within a platoon.
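The pairing rule based on destination vectors can be sketched directly: two vehicles join a platoon only if their destination vectors are close in L2 distance and point in nearly the same direction by cosine similarity. The thresholds below are illustrative assumptions, not values from the paper.

```python
import math

L2_MAX = 2.0    # assumed maximum allowed L2 distance between destinations
COS_MIN = 0.95  # assumed minimum cosine similarity of destination vectors

def l2(u, v):
    """Euclidean (L2) distance between two destination points."""
    return math.dist(u, v)

def cosine(u, v):
    """Cosine similarity between two destination vectors from the origin."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

def can_platoon(u, v):
    """Both criteria must hold for the two vehicles to be platooned."""
    return l2(u, v) <= L2_MAX and cosine(u, v) >= COS_MIN

# Nearby destinations in the same direction -> eligible for one platoon:
same_way = can_platoon((10.0, 10.0), (10.5, 9.8))
# Opposite directions fail both criteria, so no platoon is formed:
opposite = can_platoon((10.0, 10.0), (-10.0, -10.0))
```

Requiring both criteria matters on urban grids: cosine similarity alone would pair a vehicle with another heading the same way but to a far-off destination, while L2 distance alone would pair vehicles approaching the same point from opposite directions.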

Keywords: CPS, platooning, connected car, vector correlation

Procedia PDF Downloads 59
6139 Air Quality Analysis Using Machine Learning Models Under Python Environment

Authors: Salahaeddine Sbai

Abstract:

Air quality analysis using machine learning models is a method employed to assess and predict air pollution levels. This approach leverages the capabilities of machine learning algorithms to analyze vast amounts of air quality data and extract valuable insights. By training these models on historical air quality data, they can learn patterns and relationships between various factors such as weather conditions, pollutant emissions, and geographical features. The trained models can then be used to predict air quality levels in real time or to forecast future pollution levels. This application of machine learning to air quality analysis enables policymakers, environmental agencies, and the general public to make informed decisions regarding health, environmental impact, and mitigation strategies. By understanding the factors influencing air quality, interventions can be implemented to reduce pollution levels, mitigate health risks, and enhance overall air quality management. Climate change is having significant impacts on Morocco, affecting various aspects of the country's environment, economy, and society. In this study, we use several machine learning models under a Python environment to predict and analyze air quality change over northern Morocco in order to evaluate the impact of climate change on agriculture.
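As a minimal illustration of the modeling idea, with ordinary least squares standing in for the abstract's unspecified machine learning models, pollutant levels can be regressed on weather predictors. The data here are synthetic, not Moroccan observations, and the "true" relationship is invented so the fit can be checked.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
temperature = rng.uniform(10, 40, n)   # deg C
wind_speed = rng.uniform(0, 15, n)     # m/s
# Assumed ground truth for the sketch: PM10 rises with heat, falls with wind.
pm10 = 20.0 + 1.5 * temperature - 2.0 * wind_speed + rng.normal(0, 1.0, n)

# Design matrix with an intercept column; fit by ordinary least squares.
X = np.column_stack([np.ones(n), temperature, wind_speed])
coef, *_ = np.linalg.lstsq(X, pm10, rcond=None)

def predict(temp, wind):
    """Predict PM10 from the fitted coefficients."""
    return coef[0] + coef[1] * temp + coef[2] * wind
```

In a real study the same train-then-predict pattern would use observed station and reanalysis data, a held-out test split, and richer models (random forests, gradient boosting, neural networks) in place of the closed-form least-squares fit.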

Keywords: air quality, machine learning models, pollution, pollutant emissions

Procedia PDF Downloads 76
6138 Detection of Cyberattacks on the Metaverse Based on First-Order Logic

Authors: Sulaiman Al Amro

Abstract:

There are currently considerable challenges concerning data security and privacy, particularly in relation to modern technologies. This includes the virtual world known as the Metaverse, a virtual space that integrates various technologies and is therefore susceptible to cyber threats such as malware, phishing, and identity theft. This has led recent studies to propose the development of Metaverse forensic frameworks and the integration of advanced technologies, including machine learning for intrusion detection and security. In this context, the application of first-order logic offers a formal and systematic approach to defining the conditions of cyberattacks, thereby contributing to the development of effective detection mechanisms. In addition, formalizing the rules and patterns of cyber threats has the potential to enhance the overall security posture of the Metaverse and, thus, the integrity and safety of this virtual environment. The current paper focuses on the primary actions employed by avatars in potential attacks, using Interval Temporal Logic (ITL) and behavior-based detection to detect an avatar's abnormal activities within the Metaverse. The research established that the proposed framework attained an accuracy of 92.307%, with the experimental results demonstrating the efficacy of ITL, including its superior performance in addressing the threats posed by avatars within the Metaverse domain.
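A behavior-based rule of the kind formalized with first-order and interval temporal logic can be sketched as a sliding-window predicate over an avatar's action log: "there exists an interval of length W containing more than N occurrences of action A". The event names, the rule, and the thresholds below are hypothetical illustrations, not the paper's actual rule set.

```python
from dataclasses import dataclass

@dataclass
class Event:
    t: float      # timestamp in seconds
    action: str   # e.g. "teleport", "chat", "trade" (illustrative names)

def abnormal(events, action="teleport", window=10.0, max_count=3):
    """Flag an avatar if `action` occurs more than `max_count` times
    within any interval of length `window` seconds."""
    times = sorted(e.t for e in events if e.action == action)
    for i in range(len(times)):
        # Count occurrences in the window starting at times[i].
        j = i
        while j < len(times) and times[j] - times[i] <= window:
            j += 1
        if j - i > max_count:
            return True
    return False

# Evenly spaced teleports look normal; a rapid burst trips the rule.
normal_log = [Event(t, "teleport") for t in (0.0, 20.0, 40.0, 60.0)]
burst_log = [Event(t, "teleport") for t in (0.0, 1.0, 2.0, 3.0, 4.0)]
```

A production detector would maintain a library of such interval-temporal predicates, one per attack pattern, and evaluate them continuously against streaming avatar telemetry.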

Keywords: security, privacy, metaverse, cyberattacks, detection, first-order logic

Procedia PDF Downloads 26
6137 Identification of Disease Causing DNA Motifs in Human DNA Using Clustering Approach

Authors: G. Tamilpavai, C. Vishnuppriya

Abstract:

Studying DNA (deoxyribonucleic acid) sequences is useful in biological processes and is applied in fields such as diagnostic and forensic research. DNA carries the hereditary information in humans and almost all other organisms and is passed on to their generations. Early detection of a defective DNA sequence may lead to many developments in the field of bioinformatics. Nowadays, various tedious techniques are used to identify defective DNA. The proposed work analyzes a given sequence to identify a cancer-causing DNA motif. Initially, the human DNA sequence is separated into k-mers using the k-mer separation rule. The separated k-mers are clustered using a Self-Organizing Map (SOM). Using the Levenshtein distance measure, the cancer-associated DNA motif is identified from the k-mer clusters. Experimental results of this work indicate the presence or absence of the cancer-causing DNA motif: if the cancer-associated motif is found, the input is declared a cancer-disease-causing DNA sequence; otherwise, the input human DNA is declared a normal sequence. Finally, the elapsed time for finding the cancer-causing DNA motif via cluster formation is calculated and compared with the process of finding it without clustering; locating the cancer-associated motif proves faster with cluster formation. The proposed work will be an initial aid for research related to genetic diseases.
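Two of the pipeline's computational steps, k-mer separation and Levenshtein matching, can be sketched directly (the SOM clustering stage is omitted for brevity). The motif and sequence below are toy examples, not real disease markers.

```python
def kmers(seq, k):
    """Split a sequence into its overlapping k-mers."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def levenshtein(a, b):
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

MOTIF = "GATTACA"  # hypothetical disease-associated motif
sequence = "CCGATTACAGGTT"
# The k-mer closest (in edit distance) to the motif; distance 0 = exact hit.
best = min(kmers(sequence, len(MOTIF)), key=lambda m: levenshtein(m, MOTIF))
```

Clustering the k-mers first (as the abstract does with a SOM) means the motif only needs to be compared against cluster representatives rather than every k-mer, which is where the reported elapsed-time saving comes from.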

Keywords: bioinformatics, cancer motif, DNA, k-mers, Levenshtein distance, SOM

Procedia PDF Downloads 174
6136 Document Analysis for Modelling iTV Advertising towards Impulse Purchase

Authors: Azizah Che Omar

Abstract:

The study provides a systematic literature review that analyzed the literature for concepts, theories, approaches, and guidelines in order to propose a conceptual design model of interactive television advertising toward impulse purchase (iTVAdIP). An extensive review of the literature was purposely carried out to understand the concepts of interactive television (iTV). Accordingly, several elements (iTV guidelines, advertising theories, persuasive approaches, and impulse purchase elements) were analyzed to define the scope of this work. The extensive review was also necessary to achieve the objective of this study, which was to determine the concept of the iTVAdIP design model. Through systematic review analysis, this study discovered that previous models did not emphasize a conceptual design model of interactive television advertising. As a result, the findings showed that the proposed model should contain iTV guidelines, advertising theory, a persuasive approach, and impulse purchase elements. In addition, a summary diagram for the development of the proposed model is depicted to provide a clearer understanding of the concepts of the iTVAdIP conceptual design model.

Keywords: impulse purchase, interactive television advertising, human computer interaction, advertising theories

Procedia PDF Downloads 354
6135 Study of Sub-Surface Flow in an Unconfined Carbonate Aquifer in a Tropical Karst Area in Indonesia: A Modeling Approach Using Finite Difference Groundwater Model

Authors: Dua K. S. Y. Klaas, Monzur A. Imteaz, Ika Sudiayem, Elkan M. E. Klaas, Eldav C. M. Klaas

Abstract:

Due to its porous nature, karst terrain, geomorphologically developed from dissolved formations, is vulnerable to water shortage and deteriorating water quality. Therefore, a solid comprehension of the sub-surface flow of a karst landscape is essential to assess the long-term availability of groundwater resources. In this paper, a single-continuum model using the finite difference code MODFLOW was constructed to represent an unconfined carbonate aquifer on the tropical karst island of Rote in Indonesia. The model, spatially discretized into 20 x 20 m grid cells, was calibrated and validated using available groundwater levels and atmospheric variables. In the calibration and validation steps, Parameter Estimation (PEST) and geostatistical pilot point methods were employed to estimate hydraulic conductivity and specific yield values. The results show that the model is able to represent the sub-surface flow, as indicated by good model performance in both the calibration and validation steps. The final model can be used as a robust representation of the system for future studies of climate and land use scenarios.
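The finite-difference principle behind such a model can be illustrated with a toy steady-state head solver: under homogeneous, isotropic assumptions the head field reduces to Laplace's equation on the grid, solved here by Jacobi iteration. The grid size, boundary heads, and no-flow edges are invented for the sketch and bear no relation to the Rote aquifer or to MODFLOW's full formulation (which handles heterogeneity, storage, recharge, and unconfined water-table corrections).

```python
import numpy as np

def solve_heads(nrows=20, ncols=20, h_left=12.0, h_right=8.0, iters=5000):
    """Steady-state heads: fixed-head left/right boundaries, no-flow top/bottom."""
    h = np.full((nrows, ncols), (h_left + h_right) / 2.0)
    h[:, 0], h[:, -1] = h_left, h_right        # fixed-head (Dirichlet) boundaries
    for _ in range(iters):
        # Jacobi update: each interior cell becomes the mean of its 4 neighbors.
        interior = 0.25 * (h[1:-1, :-2] + h[1:-1, 2:]
                           + h[:-2, 1:-1] + h[2:, 1:-1])
        h[1:-1, 1:-1] = interior
        h[0, 1:-1] = h[1, 1:-1]                # mirror row: no-flow at top
        h[-1, 1:-1] = h[-2, 1:-1]              # mirror row: no-flow at bottom
    return h

heads = solve_heads()  # converges to a linear head gradient, left to right
```

With these boundary conditions the converged field is simply a linear gradient from 12 m to 8 m across the columns; the value of the exercise is that the same neighbor-averaging stencil, generalized with cell-by-cell conductivities, is what MODFLOW solves at scale, and what PEST then calibrates.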

Keywords: carbonate aquifer, karst, sub-surface flow, groundwater model

Procedia PDF Downloads 138
6134 Preparation of Polylactide Nanoparticles by Supercritical Fluid Technology

Authors: Jakub Zágora, Daniela Plachá, Karla Čech Barabaszová, Sylva Holešová, Roman Gábor, Alexandra Muñoz Bonilla, Marta Fernández García

Abstract:

The development of new antimicrobial materials that are not toxic to higher living organisms is a major challenge today. Newly developed materials can have high application potential in biomedicine, coatings, packaging, etc. A combination of the commonly used biopolymer polylactide (PLA) with cationic polymers appears very promising in the fight against antimicrobial resistance [1]. PLA will play a key role in fulfilling the intentions set out in the Green Deal announced by the EU Commission, as it is a bioplastic that is easily degradable, recyclable, and mass-produced. The development of 3D printing in the context of this initiative, and the actual use of PLA as one of the main materials for this printing, make the technology around the preparation and modification of PLA quite logical. Moreover, an environmentally friendly and energy-saving technology, the supercritical fluid process (SFP), was used for their preparation. In a first approach, polylactide nano- and microparticles and structures were prepared by supercritical fluid processing. The RESS (rapid expansion of supercritical solution) method is easier to optimize and shows better particle size control. By contrast, a highly porous structure was obtained using the SAS (supercritical antisolvent) method. In a second part, the antimicrobial biobased polymer was introduced by SFP.

Keywords: polylactide, antimicrobial polymers, supercritical fluid technology, micronization

Procedia PDF Downloads 170