Search results for: data-driven approach
6409 The Theory of Domination at the Bane of Conflict Resolution and Peace Building Processes in Cameroon
Authors: Nkatow Mafany Christian
Abstract:
According to UNHCR's annual database, humanitarian crises have been on the rise globally since the beginning of the 21st century, especially in the Middle East and in Sub-Saharan Africa. Cameroon is one of the countries that has suffered tremendously from humanitarian challenges in recent years, especially with crises in the Far North, the East and its two English-speaking regions. These have been a result of failed mechanisms of conflict resolution and peacebuilding by the government. The paper draws from this basic premise to argue that the failure to reach a consensus to curb internal conflicts has largely been due to the government's attachment to a domineering attitude, which emphasizes the imposition of peace terms by a superordinate agency (the government) on the subordinate (aggrieved) entities. This has stalled the peace efforts engaged so far to address the dreaded armed conflict in the North West and South West Regions, leading to its persistence. The paper exploits written, oral and online sources to sustain its argument. It suggests that an eclectic approach to resolving conflicts, which emphasizes open and frank dialogue as well as a review of the root causes, can go a long way not only to build trust but also to address the Anglophone problem in Cameroon.
Keywords: conflict, conflict resolution, peace building, humanitarian crisis
Procedia PDF Downloads 64
6408 Providing a Suitable Model for Launching New Home Appliances Products to the Market
Authors: Ebrahim Sabermaash Eshghi, Donna Sandsmark
Abstract:
In the changing economic conditions of the modern world, one of the most important issues facing managers of firms is increasing sales and profitability through the sale of newly developed products, while cutting unnecessary costs remains an essential program for managers adapting to current business conditions, in which uncertainty is dominant across industries. Accordingly, this research investigates how different aspects of introducing a product to the market influence its performance. The study uses a quantitative-qualitative approach (interviews and a questionnaire). In total, 103 informed managers and experts of the Pars-Khazar Company were examined through a census. The validity of the measurement tools was approved through the judgment of experts, and their reliability was established with a Cronbach's alpha coefficient of 0.930; overall, the validity and reliability of the tools were confirmed. The results of the regression test revealed that all aspects of product introduction had a positive and significant influence on product performance. In addition, the influence of two new factors raised in the interviews, human resource management and management of the product's pre-test, on product performance was confirmed.
Keywords: introducing products, performance, home appliances, price, advertisement, production
Procedia PDF Downloads 209
6407 Characterizing and Developing the Clinical Grade Microbiome Assay with a Robust Bioinformatics Pipeline for Supporting Precision Medicine Driven Clinical Development
Authors: Danyi Wang, Andrew Schriefer, Dennis O'Rourke, Brajendra Kumar, Yang Liu, Fei Zhong, Juergen Scheuenpflug, Zheng Feng
Abstract:
Purpose: It has been recognized that the microbiome plays critical roles in disease pathogenesis, including cancer, autoimmune disease, and multiple sclerosis. To develop a clinical-grade assay for exploring microbiome-derived clinical biomarkers across disease areas, a two-phase approach is implemented: 1) identification of the optimal sample preparation reagents using pre-mixed bacteria and healthy donor stool samples, coupled with the proprietary Sigma-Aldrich® bioinformatics solution; 2) exploratory analysis of patient samples for enabling precision medicine. Study Procedure: In the phase 1 study, we first compared the 16S sequencing results of two ATCC® microbiome standards (MSA-2002 and MSA-2003) across five different extraction kits (kits A, B, C, D and E). Both microbiome standard samples were extracted in triplicate across all extraction kits. Following isolation, DNA quantity was determined by Qubit assay. DNA quality was assessed to determine purity and to confirm that the extracted DNA is of high molecular weight. Bacterial 16S ribosomal ribonucleic acid (rRNA) amplicons were generated via amplification of the V3/V4 hypervariable region of the 16S rRNA. Sequencing was performed using a 2 × 300 bp paired-end configuration on the Illumina MiSeq. Fastq files were analyzed using the Sigma-Aldrich® Microbiome Platform, a cloud-based service that offers best-in-class 16S-seq and WGS analysis pipelines and databases. The Platform and its methods have been extensively benchmarked using microbiome standards generated internally by MilliporeSigma and other external providers. Data Summary: The DNA yield using extraction kits D and E was below the limit of detection (100 pg/µl) of the Qubit assay, as both extraction kits are intended for samples with low bacterial counts. The pre-mixed bacterial pellets at high concentrations, with an input of 2 × 10⁶ cells for MSA-2002 and 1 × 10⁶ cells for MSA-2003, were not compatible with these kits.
Among the remaining three extraction kits, kit A produced the greatest yield, whereas kit B provided the least (Kit-A/MSA-2002: 174.25 ± 34.98; Kit-A/MSA-2003: 179.89 ± 30.18; Kit-B/MSA-2002: 27.86 ± 9.35; Kit-B/MSA-2003: 23.14 ± 6.39; Kit-C/MSA-2002: 55.19 ± 10.18; Kit-C/MSA-2003: 35.80 ± 11.41 (mean ± SD)). The PCoA 3D visualization of the weighted UniFrac beta diversity shows that kits A and C cluster closely together, while kit B appears as an outlier; the kit A sequencing samples cluster more closely together than those of the other kits. The taxonomic profiles of kit B have lower recall when compared to the known mixture profiles, indicating that kit B was inefficient at detecting some of the bacteria. Conclusion: Our data demonstrated that the DNA extraction method impacts DNA concentration, purity, and the microbial communities detected by next-generation sequencing analysis. A further comparison of microbiome analysis performance using healthy stool samples is underway, and colorectal cancer patients' samples will be acquired to further explore the clinical utility. Collectively, our comprehensive qualification approach, including the evaluation of optimal DNA extraction conditions, the inclusion of positive controls, and the implementation of a robust, qualified bioinformatics pipeline, assures accurate characterization of the microbiota in a complex matrix for deciphering the deep biology and enabling precision medicine.
Keywords: 16S rRNA sequencing, analytical validation, bioinformatics pipeline, metagenomics
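The recall comparison in the data summary above can be sketched as a simple set computation. The taxa names below are hypothetical placeholders, not the actual MSA-2002/MSA-2003 compositions:

```python
def recall(expected: set, detected: set) -> float:
    """Fraction of the expected mixture taxa that a kit actually detected."""
    if not expected:
        return 0.0
    return len(expected & detected) / len(expected)

# Hypothetical mixture and kit results, for illustration only
expected = {"E. coli", "S. aureus", "B. subtilis", "L. monocytogenes"}
kit_a = {"E. coli", "S. aureus", "B. subtilis", "L. monocytogenes"}
kit_b = {"E. coli", "S. aureus"}  # misses half the mixture

print(recall(expected, kit_a))  # 1.0
print(recall(expected, kit_b))  # 0.5
```

A kit whose profile misses expected taxa shows up directly as recall below 1.0, mirroring the kit B observation.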
Procedia PDF Downloads 168
6406 Regional Flood-Duration-Frequency Models for Norway
Authors: Danielle M. Barna, Kolbjørn Engeland, Thordis Thorarinsdottir, Chong-Yu Xu
Abstract:
Design flood values give estimates of flood magnitude within a given return period and are essential to making adaptive decisions around land use planning, infrastructure design, and disaster mitigation. Often design flood values are needed at locations with insufficient data. Additionally, in hydrologic applications where flood retention is important (e.g., floodplain management and reservoir design), design flood values are required at different flood durations. A statistical approach to this problem is the development of a regression model for extremes where some of the parameters depend on flood duration in addition to being covariate-dependent. In hydrology, this is called a regional flood-duration-frequency (regional QDF) model. Typically, the underlying statistical distribution is chosen to be the generalized extreme value (GEV) distribution. However, as the support of the GEV distribution depends on both its parameters and the range of the data, special care must be taken in the development of the regional model. In particular, we find that the GEV is problematic when developing a GAMLSS-type analysis due to the difficulty of proposing a link function that is independent of the unknown parameters and the observed data. We discuss these challenges in the context of developing a regional QDF model for Norway.
Keywords: design flood values, Bayesian statistics, regression modeling of extremes, extreme value analysis, GEV
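The return-period idea behind design flood values can be sketched with the Gumbel distribution, the shape-zero special case of the GEV (the abstract's full covariate- and duration-dependent model is far richer); a method-of-moments fit on synthetic annual maxima is shown, with all data and constants illustrative:

```python
import math
import random

def gumbel_fit(maxima):
    """Method-of-moments fit of a Gumbel distribution (GEV with shape 0)."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((x - mean) ** 2 for x in maxima) / (n - 1)
    beta = math.sqrt(6 * var) / math.pi   # scale parameter
    mu = mean - 0.5772156649 * beta       # location (Euler-Mascheroni constant)
    return mu, beta

def return_level(mu, beta, T):
    """Design value exceeded on average once every T years."""
    return mu - beta * math.log(-math.log(1 - 1 / T))

# Synthetic annual flood maxima (hypothetical units)
random.seed(1)
annual_maxima = [100 + 30 * random.gammavariate(2.0, 1.0) for _ in range(50)]
mu, beta = gumbel_fit(annual_maxima)
for T in (10, 50, 100):
    print(T, round(return_level(mu, beta, T), 1))
```

Return levels grow with the return period, which is the monotone relation a QDF model must additionally resolve across flood durations.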
Procedia PDF Downloads 70
6405 High Order Block Implicit Multi-Step (HOBIM) Methods for the Solution of Stiff Ordinary Differential Equations
Authors: J. P. Chollom, G. M. Kumleng, S. Longwap
Abstract:
The search for higher order A-stable linear multi-step methods has been the interest of many numerical analysts and has been realized through either higher derivatives of the solution or by inserting additional off-step points, super-future points and the like. These methods are suitable for the solution of stiff differential equations, which exhibit characteristics that place a severe restriction on the choice of step size. It becomes necessary that only methods with large regions of absolute stability remain suitable for such equations. In this paper, high order block implicit multi-step methods of the hybrid form up to order twelve have been constructed using the multi-step collocation approach, by inserting one or more off-step points in the multi-step method. The accuracy and stability properties of the new methods are investigated and are shown to yield A-stable methods, a property desirable for methods suitable for the solution of stiff ODEs. The new high order block implicit multistep methods, used as block integrators, are tested on stiff differential systems, and the results reveal that the new methods are efficient and compete favourably with the state-of-the-art MATLAB ode23 code.
Keywords: block linear multistep methods, high order, implicit, stiff differential equations
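Why A-stability matters for stiff problems can be seen with a one-step sketch; backward Euler stands in here as the simplest A-stable implicit method (the paper's block methods are of much higher order), applied to the stiff test problem y' = -1000y with a step size far too large for the explicit method:

```python
# y' = -1000 y, y(0) = 1: the exact solution decays rapidly to 0.
lam, h, steps = -1000.0, 0.01, 50

# Explicit Euler: y_{n+1} = (1 + h*lam) * y_n, amplification factor -9
y_exp = 1.0
for _ in range(steps):
    y_exp = (1 + h * lam) * y_exp

# Backward (implicit) Euler: y_{n+1} = y_n / (1 - h*lam), factor 1/11
y_imp = 1.0
for _ in range(steps):
    y_imp = y_imp / (1 - h * lam)

print(abs(y_exp) > 1e20)   # True: the explicit iterates explode
print(abs(y_imp) < 1e-10)  # True: the implicit iterates decay like the true solution
```

With |1 + hλ| > 1 the explicit method diverges at any such step size, which is exactly the step-size restriction that large regions of absolute stability remove.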
Procedia PDF Downloads 357
6404 Stochastic Prioritization of Dependent Actuarial Risks: Preferences among Prospects
Authors: Ezgi Nevruz, Kasirga Yildirak, Ashis SenGupta
Abstract:
Comparing or ranking risks is the main motivating factor behind the human trait of making choices. Cumulative prospect theory (CPT) is a preference theory approach that evaluates perception and bias in decision making under risk and uncertainty. We aim to investigate the aggregate claims of different risk classes in terms of their comparability and amenability to ordering when the impact of risk perception is considered. For this aim, we prioritize the aggregate claims, taken as actuarial risks, by using various stochastic ordering relations, such as stochastic dominance and stop-loss dominance, which are proposed within the framework of partial order theory. We take into account the dependency of individual claims exposed to similar environmental risks. First, we modify the zero-utility premium principle in order to obtain a solution for the stop-loss premium under CPT. Then, we propose a stochastic stop-loss dominance of the aggregate claims and find a relation between stop-loss dominance and first-order stochastic dominance under the dependence assumption, using properties of familiar as well as some emerging multivariate claim distributions.
Keywords: cumulative prospect theory, partial order theory, risk perception, stochastic dominance, stop-loss dominance
Procedia PDF Downloads 319
6403 The Advancements of Transformer Models in Part-of-Speech Tagging System for Low-Resource Tigrinya Language
Authors: Shamm Kidane, Ibrahim Abdella, Fitsum Gaim, Simon Mulugeta, Sirak Asmerom, Natnael Ambasager, Yoel Ghebrihiwot
Abstract:
The call for natural language processing (NLP) systems for low-resource languages has become more apparent than ever in the past few years, though preparing such systems still poses arduous challenges. This paper presents an improved version of the Nagaoka Tigrinya Corpus for a part-of-speech (POS) tagging system in the Tigrinya language. The initial Nagaoka dataset was enlarged, bringing the new tagged corpus to 118K tokens with the 12 basic POS annotations used previously. The additional content was annotated manually and stringently, following rules similar to the former dataset, and was formatted in CoNLL format. The system made use of the monolingually pre-trained TiELECTRA, TiBERT and TiRoBERTa transformer models. The highest achieved score is an impressive weighted F1-score of 94.2%, which surpassed previous systems by a significant margin. The system will prove useful for the progress of NLP-related tasks for Tigrinya and similar low-resource languages, with room for cross-referencing higher-resource languages.
Keywords: Tigrinya POS corpus, TiBERT, TiRoBERTa, conditional random fields
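The weighted F1 metric reported above averages per-tag F1 scores weighted by tag support; a minimal sketch on toy tags (not the actual Tigrinya tagset or data) is:

```python
from collections import Counter

def weighted_f1(gold, pred):
    """Support-weighted F1 over POS tags."""
    support = Counter(gold)
    total = 0.0
    for t in set(gold):
        tp = sum(g == p == t for g, p in zip(gold, pred))
        fp = sum(p == t and g != t for g, p in zip(gold, pred))
        fn = sum(g == t and p != t for g, p in zip(gold, pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        total += f1 * support[t]
    return total / len(gold)

# Toy gold/predicted tag sequences, for illustration
gold = ["N", "V", "N", "ADJ", "N"]
pred = ["N", "V", "N", "N", "N"]
print(round(weighted_f1(gold, pred), 3))  # 0.714
```

Weighting by support makes frequent tags dominate the score, which suits corpora where a few POS categories carry most tokens.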
Procedia PDF Downloads 100
6402 Fuzzy and Fuzzy-PI Controller for Rotor Speed of Gas Turbine
Authors: Mandar Ghodekar, Sharad Jadhav, Sangram Jadhav
Abstract:
Speed control of the rotor during startup and under varying load conditions is one of the most difficult tasks of gas turbine operation. In this paper, a power plant gas turbine (GE9001E) is considered for this purpose, and fuzzy and fuzzy-PI rotor speed controllers are designed. The goal of the presented controllers is to keep the turbine rotor speed within predefined limits during the startup condition as well as during the operating condition. The fuzzy controller and fuzzy-PI controller are designed using the Takagi-Sugeno method and the Mamdani method, respectively. In applying fuzzy-PI control to a gas-turbine plant, the tuning parameters (Kp and Ki) are modified online by a fuzzy logic approach. Error and rate of change of error are the inputs, and change in fuel flow is the output for both controllers. Hence, the rotor speed of the gas turbine is controlled by modifying the fuel flow. The identified linear ARX model of the gas turbine is considered while designing the controllers. For simulations, demand power is taken as the disturbance input, and it is assumed that the inlet guide vane (IGV) position is fixed. In addition, the constraint on the fuel flow is taken into account. The performance of the presented controllers is compared with each other, as well as with H∞ robust and MPC controllers, for the same operating conditions in simulations.
Keywords: gas turbine, fuzzy controller, fuzzy PI controller, power plant
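The online gain-scheduling idea, modifying Kp and Ki from the error and its rate, can be sketched with triangular memberships; the rule base and gain ranges below are illustrative assumptions, not the paper's tuned values:

```python
def tri(x, a, b, c):
    """Triangular membership on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_pi_gains(error, d_error, kp_range=(1.0, 5.0), ki_range=(0.1, 1.0)):
    """Schedule PI gains from fuzzy degrees of 'small'/'large' speed error.

    Illustrative rule of thumb: large error or fast change -> larger Kp,
    smaller Ki; small error near the setpoint -> the opposite.
    """
    e = min(abs(error), 1.0)        # normalized error in [0, 1]
    de = min(abs(d_error), 1.0)     # normalized error rate
    small = tri(e, -1.0, 0.0, 1.0)  # degree of "error is small"
    act = max(1.0 - small, de)      # aggressiveness degree
    kp = kp_range[0] + act * (kp_range[1] - kp_range[0])
    ki = ki_range[1] - act * (ki_range[1] - ki_range[0])
    return kp, ki

print(fuzzy_pi_gains(0.0, 0.0))  # gentle gains near the setpoint
print(fuzzy_pi_gains(1.0, 0.5))  # aggressive Kp when the speed error is large
```

The scheduled gains would then feed a standard PI law whose output is the change in fuel flow.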
Procedia PDF Downloads 333
6401 Two Efficient Heuristic Algorithms for the Integrated Production Planning and Warehouse Layout Problem
Authors: Mohammad Pourmohammadi Fallah, Maziar Salahi
Abstract:
In the literature, a mixed-integer linear programming model for the integrated production planning and warehouse layout problem has been proposed. To solve the model, the authors proposed a Lagrangian relax-and-fix heuristic that takes a significant amount of time to stop, with gaps above 5% for large-scale instances. Here, we present two heuristic algorithms to solve the problem. In the first one, we use a greedy approach: warehouse locations with lower reservation costs and lower transportation costs (from the production area to locations and from locations to the output point) are allocated to items with higher demands, and then a smaller model is solved. In the second heuristic, we first sort items in descending order according to the ratio of the sum of that item's demands over the time horizon plus its maximum demand to the sum of all demands over the time horizon. Then we categorize the sorted items into groups of 3, 4, or 5 and solve a small-scale optimization problem for each group, hoping to improve the solution of the first heuristic. Our preliminary numerical results show the effectiveness of the proposed heuristics.
Keywords: capacitated lot-sizing, warehouse layout, mixed-integer linear programming, heuristic algorithms
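The sort-and-group step of the second heuristic can be sketched as follows; the scoring formula encodes our reading of the criterion, and the demand data are hypothetical:

```python
def sort_and_group(demand, group_size=4):
    """Rank items by (total demand + peak demand) relative to overall demand,
    then split the ranking into small groups, each of which would get its
    own small-scale optimization subproblem."""
    grand_total = sum(sum(d) for d in demand.values())
    score = {i: (sum(d) + max(d)) / grand_total for i, d in demand.items()}
    order = sorted(demand, key=score.get, reverse=True)
    return [order[k:k + group_size] for k in range(0, len(order), group_size)]

# Hypothetical per-item demands over a 4-period horizon
demand = {
    "A": [5, 9, 2, 0],
    "B": [1, 1, 1, 1],
    "C": [8, 8, 8, 8],
    "D": [0, 0, 0, 3],
    "E": [4, 4, 0, 4],
}
print(sort_and_group(demand, group_size=3))  # [['C', 'A', 'E'], ['D', 'B']]
```

Grouping high-priority items together keeps each subproblem small enough to solve quickly while concentrating the optimization effort where demand is heaviest.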
Procedia PDF Downloads 194
6400 Tangible Losses, Intangible Traumas: Re-envisioning Recovery Following the Lytton Creek Fire 2021 through Place Attachment Lens
Authors: Tugba Altin
Abstract:
In an era marked by pronounced climate change consequences, communities confront traumatic events that yield both tangible and intangible repercussions. Such events not only cause discernible damage to the landscape but also deeply affect the intangible aspects, including emotional distress and disruptions to cultural landscapes. The Lytton Creek Fire of 2021 serves as a case in point. Beyond the visible destruction, the less overt but profoundly impactful disturbance to place attachment (PA) is scrutinized. PA, representing the emotional and cognitive bonds individuals establish with their environments, is crucial for understanding how such events impact cultural identity and connection to the land. The study underscores the significance of addressing both tangible and intangible traumas for holistic community recovery. As communities renegotiate their affiliations with altered environments, the cultural landscape emerges as instrumental in shaping place-based identities. This renewed understanding is pivotal for reshaping adaptation planning. The research advocates for adaptation strategies rooted in the lived experiences and testimonies of the affected populations. By incorporating both the tangible and intangible facets of trauma, planning efforts are suggested to be more culturally attuned and emotionally insightful, fostering true resonance with the affected communities. Through such a comprehensive lens, this study contributes to enriching the climate change discourse, emphasizing the intertwined nature of tangible recovery and the imperative of emotional and cultural healing after environmental disasters. Following the pronounced aftermath of the fire, this research aims to deeply understand its impact on place attachment.
The interpretive phenomenological approach, enriched by a hermeneutic framework, is adopted, emphasizing the experiences of the Lytton community and co-researchers. Phenomenology informed the understanding of 'place' as the focal point of attachment, providing insights into its formation and evolution after traumatic events. Data collection departs from conventional methods: instead of traditional interviews, walking audio sessions and photo elicitation methods are utilized. These allow co-researchers to immerse themselves in the environment, re-experience, and articulate memories and feelings in real time. Walking audio facilitates reflections on spatial narratives post-trauma, while photo voices capture intangible emotions, enabling the visualization of place-based experiences. The analysis is collaborative, ensuring the co-researchers' experiences and interpretations are central. Emphasizing their agency in knowledge production, the process is rigorous, facilitated by the harmonious blend of interpretive phenomenology and hermeneutic insights. The findings underscore the need for adaptation and recovery efforts to address emotional traumas alongside tangible damages. By exploring PA post-disaster, the research not only fills a significant gap but also advocates for an inclusive approach to community recovery. Furthermore, the participatory methodologies employed challenge traditional research paradigms, heralding potential shifts in qualitative research norms.
Keywords: wildfire recovery, place attachment, trauma recovery, cultural landscape, visual methodologies
Procedia PDF Downloads 88
6399 Probabilistic Model for Evaluating Seismic Soil Liquefaction Based on Energy Approach
Authors: Hamid Rostami, Ali Fallah Yeznabad, Mohammad H. Baziar
Abstract:
The energy-based method for evaluating seismic soil liquefaction has two main components. The first is the demand energy, i.e., the energy of the earthquake dissipated at a site, and the second is the capacity energy, a representation of soil resistance against the liquefaction hazard. In this study, using a statistical analysis of data recorded by 14 down-hole array sites in California, an empirical equation was developed to estimate the demand energy at sites. Because determining the capacity energy at a site requires several site calibration factors obtained by experimental tests, in this study the standard penetration test (SPT) N-value was used as an alternative to the capacity energy at a site. Based on this assumption, the empirical equation was employed to calculate the demand energy for 193 liquefied and non-liquefied sites, and these values were plotted against the corresponding SPT numbers for all sites. Subsequently, a discrimination analysis was employed to determine the equations of several boundary curves for various liquefaction likelihoods. Finally, a comparison was made between the probabilistic model and the commonly used stress method. In conclusion, the results clearly showed that the energy-based method can be more reliable than the conventional stress-based method in evaluating liquefaction occurrence.
Keywords: energy demand, liquefaction, probabilistic analysis, SPT number
Procedia PDF Downloads 366
6398 Framework to Quantify Customer Experience
Authors: Anant Sharma, Ashwin Rajan
Abstract:
Customer experience is measured today by defining a set of metrics and KPIs, setting up thresholds, and defining triggers across those thresholds. While this is an effective way of measuring against a Key Performance Indicator (referred to as KPI in the rest of the paper), this approach cannot capture the various nuances that make up the overall customer experience. Customers consume a product or service at various levels; this is not reflected only in metrics like customer satisfaction or Net Promoter Score, but also in measurements like recurring revenue, frequency of service usage, e-learning, and depth of usage. Here we explore an alternative method of measuring customer experience by flipping the traditional view. Rather than rolling customers up to a metric, we roll metrics up to hierarchies and then measure customer experience. This method allows any team to quantify customer experience across multiple touchpoints in a customer's journey. We make use of various data sources that contain information for metrics like CXSAT, NPS, renewals, and depth of service usage collected across a customer lifecycle. This data can be mined systematically to get linkages between different data points like geographies, business groups, products, and time. Additional views can be generated by blending synthetic contexts into the data to show trends and top/bottom reports. We have created a framework that allows us to measure customer experience using the above logic.
Keywords: analytics, customer experience, BI, business operations, KPIs, metrics
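The "roll metrics up to hierarchies" idea can be sketched as a grouped aggregation; the record schema, geography and product names below are hypothetical, not the framework's actual data model:

```python
from collections import defaultdict

# Hypothetical per-customer metric records: (geography, product, metric, value)
records = [
    ("EMEA", "ProdX", "NPS", 40), ("EMEA", "ProdX", "NPS", 60),
    ("EMEA", "ProdY", "NPS", 20), ("APAC", "ProdX", "NPS", 80),
    ("EMEA", "ProdX", "CXSAT", 4.0), ("APAC", "ProdX", "CXSAT", 4.5),
]

def roll_up(records, levels):
    """Aggregate metric values up a chosen hierarchy (mean per group)."""
    sums, counts = defaultdict(float), defaultdict(int)
    for geo, product, metric, value in records:
        fields = {"geo": geo, "product": product}
        key = tuple(fields[l] for l in levels) + (metric,)
        sums[key] += value
        counts[key] += 1
    return {k: sums[k] / counts[k] for k in sums}

by_geo = roll_up(records, ["geo"])
print(by_geo[("EMEA", "NPS")])  # 40.0, the mean of 40, 60 and 20
```

The same records can be rolled up by geography, by product, or by both, giving the multiple views of experience the abstract describes.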
Procedia PDF Downloads 70
6397 Analysis of Noodle Production Process at Yan Hu Food Manufacturing: Basis for Production Improvement
Authors: Rhadinia Tayag-Relanes, Felina C. Young
Abstract:
This study was conducted to analyze the noodle production process at Yan Hu Food Manufacturing as a basis for production improvement. The study utilized the PDCA approach and record review to gather data for the calendar year 2019, from August to October, on the noodle products miki, canton, and misua. Causal-comparative research was used in this study, as it attempts to establish cause-effect relationships among variables; descriptive statistics and correlation were both used to analyze the data gathered. The study found that miki, canton, and misua production have different cycle times, different production outputs, and different numbers of wastages for each production set. The company has not yet established an allowable rejection/wastage rate; instead, this paper used a 1% wastage limit. The researcher recommended the following: the machines used for each process of noodle production must be consistently maintained and monitored; all production operators should be assessed by checking their performance statistically based on output and machine performance; a root cause analysis must be conducted to find solutions; and the recording system for the input and output of the noodle production process should be improved to eliminate poor recording of data.
Keywords: continuous improvement, process, operations, PDCA
Procedia PDF Downloads 69
6396 Ilorin Traditional Architecture as a Good Example of a Green Building Design
Authors: Olutola Funmilayo Adekeye
Abstract:
The traditional African practice of architecture can be said to be deeply rooted in green architecture in concept, design and execution. A study of the ancient building techniques in the Ilorin Emirate reveals a prominent, eco-centric application of green architecture principles. In the pre-colonial era, before the introduction of modern architecture and Western building materials, Nigerian traditional communities built their houses to meet their cultural, religious and social needs using mainly indigenous building materials such as mud (Amo), cow dung (Boto), straw (koriko) and palm fronds (Imo-Ope), to mention a few. This research attempts to identify the various techniques of applying the traditional African principles of green architecture to Ilorin traditional buildings. It examines and assesses some case studies to understand the extent to which green architecture principles have been applied to traditional building designs that are still preserved today in Ilorin, Nigeria. Furthermore, this study intends to answer many questions, which can be summarized into two basic ones: (1) What aspects of what are today recognized as important green architecture principles have been applied to Ilorin traditional buildings? (2) To what extent have these principles been ways of demonstrating a cultural attachment to the earth, as an expression of the African sense of the human being as one with nature?
Keywords: green architecture, Ilorin, traditional buildings, design principles, ecocentric, application
Procedia PDF Downloads 545
6395 Challenges Caused by the Integration of Technology as a Pedagogy in One of the Historically Disadvantaged Higher Education Institutions
Authors: Rachel Gugu Mkhasibe
Abstract:
The incorporation of technology as a pedagogy has many benefits: for instance, improved pedagogy, increased access to information, and increased cooperation and collaboration. However, good as it may be, this integration has not been widely adopted in most historically Black higher education institutions, especially those in developing countries. For example, given the socioeconomic background of students in historically Black universities, the weak financial support available from these universities, and their large student populations, many students struggle to access the recommended modern physical resources such as iPads, laptops and mobile phones, to name a few. This contributes to an increase in educational inequalities. A qualitative research approach was utilized in this work to gather detailed data about the obstacles created by the integration of technology as a pedagogy. Interviews were conducted to generate data from 20 academics and 10 level-two students from one of the historically disadvantaged higher education institutions in South Africa. The findings revealed that although both students and academics overwhelmingly supported the integration of technology as a pedagogy in their institution, the environment in which they found themselves compromised it. Therefore, this paper recommends that the Department of Higher Education and university management should intervene and budget for technology to be provided in all institutions of higher education, regardless of where the institutions are situated.
Keywords: collaboration, integration, pedagogy, technology
Procedia PDF Downloads 80
6394 Contemplating Charge Transport by Modeling of DNA Nucleobases Based Nano Structures
Authors: Rajan Vohra, Ravinder Singh Sawhney, Kunwar Partap Singh
Abstract:
Electrical charge transport through two basic DNA nucleobases, thymine and adenine, has been investigated and analyzed using the jellium model approach. FFT-2D computations have been performed with the semi-empirical extended Hückel theory using an atomistic toolkit to contemplate charge transport metrics like current and conductance. The resulting data are further evaluated in terms of the transmission spectrum, HOMO-LUMO gap, and number of electrons. We have scrutinized the behavior of the devices in the range of -2 V to 2 V with a step size of 0.2 V. We observe that both thymine and adenine can act as molecular devices when sandwiched between two gold probes. A prominent observation is a drop in the HOMO-LUMO gaps of adenine and thymine when working as a device compared to their intrinsic values, and this is comparatively more visible in the case of adenine. The current in the thymine-based device exhibits a linear increase with voltage in spite of its low conductance. Further, the broad transmission peaks represent the strong coupling of the electrodes to the scattering molecule (thymine). Moreover, the observed current in the case of thymine is almost 3-4 times that observed for adenine. The NDR effect has been perceived in the case of the adenine-based device at higher bias voltages and can be utilized in various future electronics applications.
Keywords: adenine, DNA, extended Huckel, thymine, transmission spectra
Procedia PDF Downloads 155
6393 Platooning Method Using Dynamic Correlation of Destination Vectors in Urban Areas
Authors: Yuya Tanigami, Naoaki Yamanaka, Satoru Okamoto
Abstract:
Economic losses due to delays from traffic congestion in urban transportation networks have become a more serious social problem as traffic volume increases. Platooning has recently been attracting attention from many researchers as a way to alleviate traffic jams, especially on highways, where it can have positive effects such as reducing inter-vehicular distance and reducing air resistance. However, the impact of platooning on urban roads has not been addressed in detail, since traffic lights may break up platoons. In this study, we propose a platooning method using the L2 norm and cosine similarity to form platoons of vehicles with highly similar routes. We also investigate how to sort vehicles within a platoon according to each vehicle's straightness. Our proposed sorting method, which uses two lanes, eliminates head-of-line blocking at intersections and improves throughput there. This paper proposes a cyber-physical system (CPS) approach to collaborative urban platoon control. We conduct simulations using the traffic simulator SUMO on a road network that imitates Manhattan Island. Results from SUMO confirmed that our method shortens the average travel time by 10-20%. This paper shows the validity of forming platoons based on destination vectors and of sorting vehicles within a platoon.
Keywords: CPS, platooning, connected car, vector correlation
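The destination-vector matching step can be sketched as follows; the cosine and L2 thresholds are illustrative, not the paper's calibrated values:

```python
import math

def cosine(u, v):
    """Cosine similarity of two 2D vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

def can_platoon(origin_a, dest_a, origin_b, dest_b, cos_min=0.95, l2_max=2.0):
    """Join two vehicles if their destination vectors point the same way
    (cosine similarity) and their destinations are close (L2 norm)."""
    va = (dest_a[0] - origin_a[0], dest_a[1] - origin_a[1])
    vb = (dest_b[0] - origin_b[0], dest_b[1] - origin_b[1])
    return cosine(va, vb) >= cos_min and math.dist(dest_a, dest_b) <= l2_max

print(can_platoon((0, 0), (10, 10), (1, 0), (11, 10)))   # True: parallel routes
print(can_platoon((0, 0), (10, 10), (0, 0), (10, -10)))  # False: diverging routes
```

Requiring both a direction match and destination proximity keeps platoons together longer before traffic lights or turns split them.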
Procedia PDF Downloads 74
6392 Air Quality Analysis Using Machine Learning Models Under Python Environment
Authors: Salahaeddine Sbai
Abstract:
Air quality analysis using machine learning models is a method employed to assess and predict air pollution levels. This approach leverages the capabilities of machine learning algorithms to analyze vast amounts of air quality data and extract valuable insights. By training these models on historical air quality data, they can learn patterns and relationships between various factors such as weather conditions, pollutant emissions, and geographical features. The trained models can then be used to predict air quality levels in real time or to forecast future pollution levels. This application of machine learning in air quality analysis enables policymakers, environmental agencies, and the general public to make informed decisions regarding health, environmental impact, and mitigation strategies. By understanding the factors influencing air quality, interventions can be implemented to reduce pollution levels, mitigate health risks, and enhance overall air quality management. Climate change is having significant impacts on Morocco, affecting various aspects of the country's environment, economy, and society. In this study, we use several machine learning models in a Python environment to predict and analyze air quality change over northern Morocco and to evaluate the impact of climate change on agriculture.
Keywords: air quality, machine learning models, pollution, pollutant emissions
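The "learn relationships between weather and pollutant levels" idea can be sketched with the simplest possible model, a one-feature least-squares fit; the wind-speed and PM10 values below are synthetic, standing in for historical monitoring records:

```python
# Synthetic observations: higher wind speed tends to disperse pollution
wind = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]        # m/s
pm10 = [80.0, 72.0, 60.0, 55.0, 41.0, 35.0]  # µg/m³

# Closed-form simple linear regression
n = len(wind)
mx = sum(wind) / n
my = sum(pm10) / n
slope = sum((x - mx) * (y - my) for x, y in zip(wind, pm10)) / \
        sum((x - mx) ** 2 for x in wind)
intercept = my - slope * mx

predict = lambda x: intercept + slope * x
print(round(slope, 2))        # negative: wind improves air quality in this toy data
print(round(predict(3.5), 1))  # 57.2
```

Real studies would add many more predictors (temperature, humidity, emissions) and use richer models, but the fit-then-predict workflow is the same.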
Procedia PDF Downloads 91
6391 Detection of Cyberattacks on the Metaverse Based on First-Order Logic
Authors: Sulaiman Al Amro
Abstract:
There are currently considerable challenges concerning data security and privacy, particularly in relation to modern technologies. This includes the virtual world known as the Metaverse, a virtual space that integrates various technologies and is therefore susceptible to cyber threats such as malware, phishing, and identity theft. Recent studies have consequently proposed the development of Metaverse forensic frameworks and the integration of advanced technologies, including machine learning, for intrusion detection and security. In this context, the application of first-order logic offers a formal and systematic approach to defining the conditions of cyberattacks, thereby contributing to the development of effective detection mechanisms. In addition, formalizing the rules and patterns of cyber threats has the potential to enhance the overall security posture of the Metaverse and, thus, the integrity and safety of this virtual environment. The current paper focuses on the primary actions avatars may employ in potential attacks, using Interval Temporal Logic (ITL) and behavior-based detection to detect an avatar's abnormal activities within the Metaverse. The research established that the proposed framework attained an accuracy of 92.307%; the experimental results demonstrate the efficacy of ITL, including its superior performance in addressing the threats posed by avatars within the Metaverse domain.
Keywords: security, privacy, metaverse, cyberattacks, detection, first-order logic
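The flavor of this approach, quantified rules over timestamped avatar actions, can be illustrated with a toy behavior-based detector. The event format, the rule, and the thresholds are hypothetical; the paper's actual ITL framework is far richer.

```python
from dataclasses import dataclass

@dataclass
class Action:
    avatar: str
    kind: str      # e.g. "message", "teleport", "asset_transfer"
    t: float       # timestamp in seconds

def abnormal(actions, avatar, kind, window=10.0, limit=5):
    """Holds if there exists a time interval of length `window` in which
    `avatar` performs more than `limit` actions of the given kind - an
    exists-quantified condition over timestamped facts, in the spirit of
    an interval-temporal rule."""
    times = sorted(a.t for a in actions
                   if a.avatar == avatar and a.kind == kind)
    for i, start in enumerate(times):
        in_window = [t for t in times[i:] if t - start <= window]
        if len(in_window) > limit:
            return True
    return False

log = [Action("eve", "message", float(t)) for t in range(8)]       # burst
log += [Action("bob", "message", float(t * 20)) for t in range(4)]  # spread
print(abnormal(log, "eve", "message"))  # True: 8 messages inside 10 s
print(abnormal(log, "bob", "message"))  # False: messages 20 s apart
```

A full detector would combine many such rules (one per attack pattern) and evaluate them over a live stream of avatar events.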
Procedia PDF Downloads 39
6390 Identification of Disease Causing DNA Motifs in Human DNA Using Clustering Approach
Authors: G. Tamilpavai, C. Vishnuppriya
Abstract:
Studying DNA (deoxyribonucleic acid) sequences is useful for understanding biological processes and is applied in fields such as diagnostic and forensic research. DNA carries the hereditary information in humans and almost all other organisms and is passed on to subsequent generations. Early detection of defective DNA sequences may lead to many developments in the field of bioinformatics. Nowadays, various tedious techniques are used to identify defective DNA. The proposed work analyzes a given sequence to identify a cancer-causing DNA motif. Initially, the human DNA sequence is separated into k-mers using a k-mer separation rule. The separated k-mers are clustered using a Self-Organizing Map (SOM). Using the Levenshtein distance measure, the cancer-associated DNA motif is identified from the k-mer clusters. Experimental results of this work indicate the presence or absence of the cancer-causing DNA motif. If the cancer-associated DNA motif is found in the DNA, it is declared a cancer-causing DNA sequence; otherwise, the input human DNA is declared a normal sequence. Finally, the elapsed time for finding the cancer-causing DNA motif via cluster formation is calculated and compared with the standard search process: locating the cancer-associated motif is faster with cluster formation than without it. The proposed work will be an initial aid for research into genetic diseases.
Keywords: bioinformatics, cancer motif, DNA, k-mers, Levenshtein distance, SOM
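Two steps of this pipeline, splitting a sequence into k-mers and matching a k-mer against a known motif by Levenshtein distance, can be sketched as follows. The motif, the sequence, and the distance threshold are illustrative; the SOM clustering step is omitted.

```python
def kmers(seq, k):
    """All overlapping substrings of length k."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def levenshtein(a, b):
    """Classic dynamic-programming edit distance (row-by-row)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

motif = "GATTACA"                # hypothetical disease-associated motif
sequence = "CCGATTACAGGTTAG"     # illustrative input sequence
hits = [km for km in kmers(sequence, len(motif))
        if levenshtein(km, motif) <= 1]
print(hits)  # ['GATTACA'] - an exact or near match was found
```

In the paper's scheme, the distance comparison runs only against k-mers in the nearest SOM cluster rather than against every k-mer, which is where the reported speed-up comes from.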
Procedia PDF Downloads 186
6389 Document Analysis for Modelling iTV Advertising towards Impulse Purchase
Authors: Azizah Che Omar
Abstract:
The study provides a systematic literature review that analyzed the literature for concepts, theories, approaches, and guidelines in order to propose a conceptual design model of interactive television advertising toward impulse purchase (iTVAdIP). An extensive review of the literature was carried out to understand the concepts of interactive television (iTV). Accordingly, several elements, namely iTV guidelines, advertising theories, persuasive approaches, and impulse purchase elements, were analyzed to define the scope of this work. The extensive review was also necessary to achieve the objective of this study, which was to determine the concept of the iTVAdIP design model. Through the systematic review analysis, this study discovered that none of the previous models emphasized a conceptual design model of interactive television advertising. As a result, the findings showed that the proposed model should incorporate iTV guidelines, advertising theory, a persuasive approach, and impulse purchase elements. In addition, a summary diagram for the development of the proposed model is depicted to provide a clearer understanding of the concepts of the iTVAdIP conceptual design model.
Keywords: impulse purchase, interactive television advertising, human computer interaction, advertising theories
Procedia PDF Downloads 368
6388 Study of Sub-Surface Flow in an Unconfined Carbonate Aquifer in a Tropical Karst Area in Indonesia: A Modeling Approach Using Finite Difference Groundwater Model
Authors: Dua K. S. Y. Klaas, Monzur A. Imteaz, Ika Sudiayem, Elkan M. E. Klaas, Eldav C. M. Klaas
Abstract:
Due to its porous nature, karst terrain, geomorphologically developed from dissolved formations, is vulnerable to water shortage and deteriorating water quality. A solid comprehension of sub-surface flow in karst landscapes is therefore essential to assess the long-term availability of groundwater resources. In this paper, a single-continuum model using the finite difference code MODFLOW was constructed to represent an unconfined carbonate aquifer on the tropical karst island of Rote in Indonesia. The model, spatially discretized into 20 x 20 m grid cells, was calibrated and validated using available groundwater level and atmospheric variables. In the calibration and validation steps, Parameter Estimation (PEST) and geostatistical pilot point methods were employed to estimate hydraulic conductivity and specific yield values. The results show that the model is able to represent the sub-surface flow, as indicated by good model performance in both the calibration and validation steps. The final model can be used as a robust representation of the system for future studies of climate and land use scenarios.
Keywords: carbonate aquifer, karst, sub-surface flow, groundwater model
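The finite-difference idea behind a MODFLOW grid can be shown with a toy one-dimensional steady-state solve of Laplace's equation for hydraulic head between two fixed-head boundaries. The grid size, boundary heads, and assumption of uniform conductivity with no recharge are all illustrative; the real model is three-dimensional with heterogeneous, PEST-estimated parameters.

```python
def steady_head_1d(h_left, h_right, n_cells, tol=1e-8):
    """Jacobi iteration for d2h/dx2 = 0 on a uniform 1-D grid:
    each interior head relaxes toward the mean of its neighbors,
    with fixed heads at both boundaries."""
    h = [h_left] + [0.0] * (n_cells - 2) + [h_right]
    while True:
        new = (h[:1]
               + [(h[i - 1] + h[i + 1]) / 2 for i in range(1, n_cells - 1)]
               + h[-1:])
        if max(abs(a - b) for a, b in zip(new, h)) < tol:
            return new
        h = new

heads = steady_head_1d(10.0, 4.0, 7)
print([round(x, 3) for x in heads])  # [10.0, 9.0, 8.0, 7.0, 6.0, 5.0, 4.0]
```

With uniform conductivity the converged solution is the expected linear head profile; heterogeneous conductivity and recharge terms would bend this profile, which is exactly what calibration against observed groundwater levels constrains.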
Procedia PDF Downloads 147
6387 Preparation of Polylactide Nanoparticles by Supercritical Fluid Technology
Authors: Jakub Zágora, Daniela Plachá, Karla Čech Barabaszová, Sylva Holešová, Roman Gábor, Alexandra Muñoz Bonilla, Marta Fernández García
Abstract:
The development of new antimicrobial materials that are not toxic to higher living organisms is a major challenge today. Newly developed materials can have high application potential in biomedicine, coatings, packaging, etc. Combining the commonly used biopolymer polylactide (PLA) with cationic polymers appears to be very successful in the fight against antimicrobial resistance [1]. PLA will play a key role in fulfilling the intention set out in the Green Deal announced by the EU Commission, as it is a bioplastic that is easily degradable, recyclable, and mass-produced. The growth of 3D printing in the context of this initiative, and the actual use of PLA as one of the main materials for such printing, make the technology surrounding the preparation and modification of PLA quite logical. Moreover, an environmentally friendly and energy-saving technology, the supercritical fluid process (SFP), will be used for their preparation. In a first approach, polylactide nano- and microparticles and structures were prepared by supercritical fluid extraction. The RESS (rapid expansion of supercritical solution) method is easier to optimize and shows better particle size control. In contrast, a highly porous structure was obtained using the SAS (supercritical antisolvent) method. In a second part, the antimicrobial bio-based polymer was introduced by SFP.
Keywords: polylactide, antimicrobial polymers, supercritical fluid technology, micronization
Procedia PDF Downloads 186
6386 Perceptions of Educators on the Learners’ Youngest Age for the Introduction of ICTs in Schools: A Personality Theory Approach
Authors: Kayode E. Oyetade, Seraphin D. Eyono Obono
Abstract:
Age ratings are very helpful in providing parents with relevant information for the purchase and use of digital technologies by children; the absence of defined age ratings for the use of ICTs by children in schools is therefore a major concern. This problem motivates the present study, whose aim is to examine the factors affecting the perceptions of educators on the learners’ youngest age for the introduction of ICTs in schools. This aim is pursued through two research objectives: the identification and design of theories and models on age ratings, and the empirical testing of such theories and models in a survey of educators from the Camperdown district of the South African KwaZulu-Natal province. A questionnaire was used to collect the survey data, whose validity and reliability were checked in SPSS prior to descriptive and correlational quantitative analysis. The main hypothesis supporting this research is an association between the demographics of educators, their personality, and their perceptions on the learners’ youngest age for the introduction of ICTs in schools, as claimed by existing research, except that the present study considers personality along three dimensions: self-actualized, fully functioning, and healthy personalities. This hypothesis was fully confirmed by the empirical study except for the demographic factor, where only the educators’ grade or class was found to be associated with their personality.
Keywords: age ratings, educators, e-learning, personality theories
Procedia PDF Downloads 235
6385 Comparative Operating Speed and Speed Differential Day and Night Time Models for Two Lane Rural Highways
Authors: Vinayak Malaghan, Digvijay Pawar
Abstract:
Speed is an independent parameter that plays a vital role in highway design. The design consistency of a highway is checked based on variation in operating speed, and design consistency often fails to meet drivers’ expectations, resulting in a difference between operating and design speed. Literature reviews have shown that a significant number of crashes occur on horizontal curves due to a lack of design consistency. This paper focuses on a continuous speed profile study of tangent-to-curve transitions for both day-time and night-time conditions. Data were collected using a GPS device, which provides a continuous speed profile; other parameters such as acceleration and deceleration were analyzed along the tangent-to-curve transition. In the present study, models were developed to predict operating speed on tangents and horizontal curves, as well as a model of the speed reduction from tangent to curve, based on continuous speed profile data. It is observed that vehicles tend to decelerate from the approach tangent to a point between the beginning and the midpoint of the curve, and then accelerate through the curve-to-tangent transition. The models generated were compared for both day and night and can be used in road safety improvement by evaluating geometric design consistency.
Keywords: operating speed, design consistency, continuous speed profile data, day and night time
Procedia PDF Downloads 156
6384 The Changing Face of Tourism-Making the Connection through Technological Advancement
Authors: Faduma Ahmed-Ali
Abstract:
The up-and-coming new generation of travelers will change how the world achieves global connectivity. The goal is that, through people and worldwide technological advancement, travelers will be able to better explore the culture and beauty of each host country and gain a better understanding of its core values and treasures. Through Rika's unique world-connection model, tourists can explore their destination with the help of local connections. Achieving a complete understanding of the host country, while ensuring equitable economic prosperity and cultural exchange, is key to changing the face of tourism. A recent survey conducted by the author at Portland International Airport shows that over 50% of tourists entering Portland, Oregon are more eager to explore the city through local residents than through a pre-planned itinerary created by travel companies. This new model, Rika, aims to highlight the importance of connecting tourists with technological tools that increase connectivity with locals, for a better travel experience that fosters shared economic prosperity throughout a community, achieving the goal of creating a sustainable, people-driven economy.
Keywords: RIKA, tourism, connection, technology, economic impact, sustainability, hospitality, strategies, tourism development, environment
Procedia PDF Downloads 284
6383 Numerical Multi-Scale Modeling of Rubber Friction on Rough Pavements Using Finite Element Method
Authors: Ashkan Nazari, Saied Taheri
Abstract:
Knowledge of tire-pavement interaction plays a crucial role in designing safer and more reliable tires. Characterizing the frictional tire-pavement interaction leads to a better understanding of vehicle performance in braking and acceleration. In this work, we devise a multi-scale simulation approach to incorporate the effect of pavement surface asperities at different length scales. We construct two- and three-dimensional finite element (FE) models to simulate the interaction between a rubber block and a rough pavement surface with asperities at different scales. To achieve this, the road profile is scanned with a laser profilometer, and the measured asperities are implemented in FE software (ABAQUS) at micro and macro length scales. Hysteresis friction, which arises from the dissipative nature of rubber, is the main component of the friction force and is therefore the subject of this work. Using different scales not only assists in characterizing the pavement asperities in sufficient detail but is also highly effective in preventing the extreme local deformations and stress gradients that cause divergence in FE simulations. The simulation results will be validated against experimental results as well as results reported in the literature.
Keywords: friction, finite element, multi-scale modeling, rubber
Procedia PDF Downloads 135
6382 Barriers Facing the Implementation of Lean Manufacturing in Libyan Manufacturing Companies
Authors: Mohamed Abduelmula, Martin Birkett, Chris Connor
Abstract:
Lean manufacturing has developed from a set of tools and methods into a management philosophy that can be used to remove or reduce waste in manufacturing processes and thereby enhance the operational productivity of an enterprise. Several enterprises around the world have applied the lean manufacturing system and achieved great improvements. This paper investigates the barriers and obstacles that Libyan manufacturing companies face in implementing lean manufacturing. A mixed-methods approach is suggested, starting with a questionnaire to obtain quantitative data and then using this to develop semi-structured interviews to collect qualitative data. The questionnaire results, and how they can be used to further develop the semi-structured interviews, are then discussed. The survey was distributed to 65 manufacturing companies in Libya, and a response rate of 64.6% was obtained. The results showed five main barriers to implementing lean in Libya: organizational culture; skills, expertise, and training; financial capability; top management; and communication. These barriers have also been identified in the literature as significant obstacles to implementing lean in other countries' industries. Understanding the difficulties that face the implementation of lean manufacturing systems, as a new and modern system, and using this understanding to develop a suitable framework will help to improve the manufacturing sector in Libya.
Keywords: lean manufacturing, barriers, questionnaire, Libyan manufacturing companies
Procedia PDF Downloads 245
6381 Political Discourse and Linguistic Manipulation in Nigerian Politics
Authors: Kunle Oparinde, Ernestina Maleshoane Rapeane-Mathonsi, Gift Mheta
Abstract:
Using Critical Discourse Analysis (CDA) and Multimodal Discourse Analysis (MDA), this research seeks to deconstruct politically motivated discourse in Nigerian politics. This is achieved by analysing linguistic (mis)representation and manipulation in Nigerian political settings, drawing on instances of language use observed in different political campaigns. Since language in itself is generally meaningless without context, it is paramount to analyse the (mis)representation and manipulation in Nigerian political scenes on a contextual basis. The study focuses on political language used by Nigerian politicians in printed and social media forms such as posters, pamphlets, speeches, billboards, and internet sources purposely selected across Nigeria. The research further investigates the discursive strategies politicians use to attract a larger audience and, as a result, shape opinions that translate into votes. The study employs a qualitative approach. Two parties are intentionally selected because they have been consistently strong at the national level, namely the All Progressives Congress (APC) and the People’s Democratic Party (PDP). The study finds that politicians in Nigeria, as in many parts of the world, use language to manipulate the electorate. A comprehensive discussion of these instances of political manipulation remains the thrust of this paper.
Keywords: communication, discourse, manipulation, misrepresentation
Procedia PDF Downloads 249
6380 A Multicenter Assessment on Psychological Well-Being Status among Medical Residents in the United Arab Emirates
Authors: Mahera Abdulrahman
Abstract:
Objective: The recent transformation of healthcare in the country from traditional to modern has prompted the need to address career choices, accreditation perception, and satisfaction among medical residents. However, a concerted nationwide study to understand and address burnout in medical residency programs had not been conducted in the UAE or the region. Methods: A nationwide, multicenter, cross-sectional study was designed to evaluate professional burnout and depression among medical residents in order to address this gap. Results: Our results indicate that 75.5% (216/286) of UAE medical residents had moderate to high emotional exhaustion, 84% (249/298) had high depersonalization, and 74% (216/291) had a low sense of personal accomplishment. In aggregate, 70% (212/302) of medical residents were experiencing at least one symptom of burnout, based on a high emotional exhaustion score or a high depersonalization score. Depression rates of 6-22%, depending on the specialty, were also striking, given that Arab culture places high emphasis on family bonding. Interestingly, 83% (40/48) of medical residents with high depression scores also reported burnout. Conclusion: Our data indicate that burnout and depression among medical residents are at epidemic levels. There is an immediate need to address burnout through effective interventions at both the individual and institutional levels. It is imperative to reconfigure the approach to medical training for the well-being of the next generation of physicians in the Arab world.
Keywords: mental health, Gulf, Arab, residency training, burnout, depression
Procedia PDF Downloads 294