Search results for: dynamical analysis
26961 Understanding the Semantic Network of Tourism Studies in Taiwan by Using Bibliometrics Analysis
Authors: Chun-Min Lin, Yuh-Jen Wu, Ching-Ting Chung
Abstract:
The formulation of tourism policies requires objective academic research and evidence as support, especially research from local academia. Taiwan is a small island, and its economic growth relies heavily on tourism revenue. The Taiwanese government has devoted itself to the promotion of the tourism industry over the past few decades. Scientific research outcomes by Taiwanese scholars can help lay the foundations for drafting future tourism policy by the government. In this study, a total of 120 full journal articles published between 2008 and 2016 in the Journal of Tourism and Leisure Studies (JTSL) were examined to explore the scientific research trends of tourism studies in Taiwan. JTSL is one of the most important Taiwanese journals in the tourism discipline; it focuses on tourism-related issues and publishes in traditional Chinese. The method of co-word analysis from the bibliometrics approach was employed for semantic analysis in this study. When analyzing Chinese words and phrases, word segmentation is a crucial step. It must be carried out first and precisely in order to obtain meaningful words or word chunks for further frequency calculation. A word segmentation system based on the N-gram algorithm was developed in this study to conduct semantic analysis, and the 100 groups of meaningful phrases with the highest recurrence rates were located. Subsequently, co-word analysis was employed for semantic classification. The results showed that the themes of tourism research in Taiwan in recent years cover tourism education, environmental protection, hotel management, information technology, and senior tourism. The results can give insight into related issues and serve as a reference for tourism-related policy making and follow-up research.
Keywords: bibliometrics, co-word analysis, word segmentation, tourism research, policy
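The n-gram frequency-counting step described above can be sketched in a few lines. This is only an illustration of the general technique, not the authors' segmentation system; the toy English corpus stands in for the segmented Chinese article text.

```python
from collections import Counter

def char_ngrams(text, n):
    """Return all overlapping character n-grams of a string."""
    return [text[i:i + n] for i in range(len(text) - n + 1)]

def top_ngrams(corpus, n=2, k=5):
    """Count n-gram frequencies across a corpus and return the k most common."""
    counts = Counter()
    for doc in corpus:
        counts.update(char_ngrams(doc, n))
    return counts.most_common(k)

# Toy corpus standing in for segmented article text (hypothetical data).
corpus = ["tourism education policy", "tourism policy research"]
print(top_ngrams(corpus, n=2, k=3))
```

In a real pipeline the highest-frequency n-grams would then be filtered down to meaningful word chunks before the co-word analysis.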
Procedia PDF Downloads 229
26960 NOx Emission and Computational Analysis of Jatropha Curcas Fuel and Crude Oil
Authors: Vipan Kumar Sohpal, Rajesh K Sharma
Abstract:
The depletion of conventional fuels and uncontrolled vehicle emissions lead to deterioration of the environment, which motivates research on biofuels. Biofuels from different sources attract research attention due to their low emissions and biodegradability. Emissions of carbon monoxide, carbon dioxide, and hydrocarbons (HC) are reduced drastically with biofuel (B-20) combustion. Contrary to conventional fuel, engine emission results indicate that nitrogen oxide emissions are higher with biofuels. This paper therefore examines and compares the nitrogen oxide emissions of Jatropha curcas oil (JCO) B-20 blends with those of vegetable oil. In addition, a computational analysis of the crude non-edible oil is performed to assess the impact of composition on emission quality. In conclusion, JCO is a potential feedstock for biodiesel production after genetic modification of the plant.
Keywords: jatropha curcas, computational analysis, emissions, NOx, biofuels
Procedia PDF Downloads 587
26959 Understanding Mathematics Achievements among U.S. Middle School Students: A Bayesian Multilevel Modeling Analysis with Informative Priors
Authors: Jing Yuan, Hongwei Yang
Abstract:
This paper aims to understand U.S. middle school students’ mathematics achievements by examining relevant student- and school-level predictors. Through a variance component analysis, the study first identifies evidence supporting the use of multilevel modeling. Then, a multilevel analysis is performed under Bayesian statistical inference where prior information is incorporated into the modeling process. During the analysis, independent variables are entered sequentially in the order of theoretical importance to create a hierarchy of models. By evaluating each model using Bayesian fit indices, a best-fit and most parsimonious model is selected, for which Bayesian statistical inference is performed for the purpose of result interpretation and discussion. The primary dataset for Bayesian modeling is derived from the Program for International Student Assessment (PISA) in 2012, with a secondary PISA dataset from 2003 analyzed under the traditional ordinary least squares method to provide the information needed to specify informative priors for a subset of the model parameters. The dependent variable is a composite measure of mathematics literacy, calculated from an exploratory factor analysis of all five PISA 2012 mathematics achievement plausible values, for which multiple pieces of evidence support unidimensionality of the data. The independent variables include demographic variables and content-specific variables: mathematics efficacy, teacher-student ratio, proportion of girls in the school, etc. Finally, the entire analysis is performed using the MCMCpack and MCMCglmm packages in R.
Keywords: Bayesian multilevel modeling, mathematics education, PISA, multilevel
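The role of an informative prior can be illustrated with a conjugate normal-normal update. This is only a sketch of the general idea in Python rather than the paper's R-based multilevel model, and all numbers are hypothetical stand-ins for, say, a slope estimated from the 2003 OLS fit combined with the 2012 sample.

```python
def normal_posterior(prior_mean, prior_var, data_mean, data_var, n):
    """Conjugate normal-normal update: combine an informative prior with
    the likelihood summary of n observations (known variance case)."""
    prec_prior = 1.0 / prior_var       # prior precision
    prec_data = n / data_var           # likelihood precision
    post_var = 1.0 / (prec_prior + prec_data)
    post_mean = post_var * (prec_prior * prior_mean + prec_data * data_mean)
    return post_mean, post_var

# Hypothetical numbers: a prior centered at 0.30 (from the 2003 analysis)
# pulled toward the 2012 sample mean of 0.50.
m, v = normal_posterior(prior_mean=0.30, prior_var=0.04,
                        data_mean=0.50, data_var=1.0, n=100)
print(round(m, 3), round(v, 4))  # posterior sits between prior and data
```

The posterior mean is a precision-weighted compromise, which is exactly the mechanism by which the 2003-based priors inform the 2012 estimates.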
Procedia PDF Downloads 336
26958 Copula-Based Estimation of Direct and Indirect Effects in Path Analysis Models
Authors: Alam Ali, Ashok Kumar Pathak
Abstract:
Path analysis is a statistical technique used to evaluate the direct and indirect effects of variables in path models. One or more structural regression equations are used to estimate a series of parameters in path models to find a better fit to the data. However, sometimes the assumptions of classical regression models, such as ordinary least squares (OLS), are violated by the nature of the data, resulting in insignificant direct and indirect effects of exogenous variables. This article aims to explore the effectiveness of a copula-based regression approach as an alternative to classical regression, specifically when variables are linked through an elliptical copula.
Keywords: path analysis, copula-based regression models, direct and indirect effects, k-fold cross validation technique
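As a sketch of the elliptical-copula idea, the following draws pairs of uniform marginals linked by a Gaussian copula (the simplest elliptical case). The correlation value, sample size, and seed are arbitrary choices for illustration, not from the paper.

```python
import math
import random

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def gaussian_copula_pair(rho, rng):
    """One pair of uniforms whose dependence follows a Gaussian copula."""
    z1 = rng.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
    return phi(z1), phi(z2)  # probability transform keeps uniform margins

def pearson(xs, ys):
    """Sample Pearson correlation."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

rng = random.Random(42)
pairs = [gaussian_copula_pair(0.8, rng) for _ in range(2000)]
u1, u2 = [p[0] for p in pairs], [p[1] for p in pairs]
print(round(pearson(u1, u2), 2))  # strong positive dependence on uniform margins
```

In a copula-based regression the marginals would then be mapped to the observed variables' distributions, separating the dependence structure from the marginal fits.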
Procedia PDF Downloads 41
26957 The Analysis of a Reactive Hydromagnetic Internal Heat Generating Poiseuille Fluid Flow through a Channel
Authors: Anthony R. Hassan, Jacob A. Gbadeyan
Abstract:
In this paper, the analysis of a reactive hydromagnetic Poiseuille fluid flow under each of sensitized, Arrhenius, and bimolecular chemical kinetics through a channel in the presence of a heat source is carried out. An exothermic reaction is assumed, while the consumption of the material is neglected. The Adomian Decomposition Method (ADM) together with Padé approximation is used to obtain solutions of the governing nonlinear, non-dimensional differential equations. The effects of various physical parameters on the velocity and temperature fields of the fluid flow are investigated. The entropy generation analysis and the conditions for thermal criticality are also presented.
Keywords: chemical kinetics, entropy generation, thermal criticality, Adomian decomposition method (ADM), Padé approximation
Procedia PDF Downloads 464
26956 Mitigation of Size Effects in Woven Fabric Composites Using Finite Element Analysis Approach
Authors: Azeez Shaik, Yagnik Kalariya, Amit Salvi
Abstract:
High-performance requirements and emission norms are forcing the automobile industry to opt for lightweight materials that improve fuel efficiency and absorb energy in crash applications. In this scenario, woven fabric composites provide better energy absorption than metals. Woven fabric composites have a repetitive unit cell (RUC), and the mechanical properties of these materials are highly dependent on the RUC. This work investigates the importance of detailed modelling of the RUC, the associated size effects, and the mitigation techniques to avoid them using a finite element analysis approach.
Keywords: repetitive unit cell, representative volume element, size effects, cohesive zone, finite element analysis
Procedia PDF Downloads 255
26955 The Moderation Effect of Critical Item on the Strategic Purchasing: Quality Performance Relationship
Authors: Kwong Yeung
Abstract:
Theories about strategic purchasing and quality performance are underdeveloped. Understanding the evolving role of purchasing from reactive to proactive is a pressing strategic issue. Using survey responses from 176 manufacturing and electronics industry professionals, we study the relationships between strategic purchasing and supply chain partners’ quality performance to answer the following questions: Can transaction cost economics be used to elucidate the strategic purchasing-quality performance relationship? Is this strategic purchasing-quality performance relationship moderated by critical item analysis? The findings indicate that critical item analysis positively and significantly moderates the strategic purchasing-quality performance relationship.
Keywords: critical item analysis, moderation, quality performance, strategic purchasing, transaction cost economics
Procedia PDF Downloads 563
26954 Coordinated Voltage Control in a Radial Distribution System
Authors: Shivarudraswamy, Anubhav Shrivastava, Lakshya Bhat
Abstract:
Distributed generation has become a major area of interest in recent years. Distributed generation can address a large number of loads in a power line and hence has better efficiency than conventional methods. However, there are certain drawbacks associated with it, an increase in voltage being the major one. This paper addresses voltage control at the buses of an IEEE 30-bus system by regulating reactive power. For the analysis, suitable locations for placing distributed generators (DGs) are identified through load flow analysis by observing where the voltage profile dips. MATLAB programming is used to regulate the voltage at all buses to within +/-5% of the base value even after the introduction of DGs. Three methods for the regulation of voltage are discussed. A sensitivity-based analysis is later carried out to determine the priority among the various methods listed in the paper.
Keywords: distributed generators, distributed system, reactive power, voltage control
Procedia PDF Downloads 500
26953 An Analysis of Discourse Markers Awareness in Writing Undergraduate Thesis of English Education Student in Sebelas Maret University
Authors: Oktanika Wahyu Nurjanah, Anggun Fitriana Dewi
Abstract:
An undergraduate thesis is a piece of academic writing that should fulfill several characteristics, one of which is coherence. Moreover, the coherence of a text depends on the use of discourse markers. In other words, discourse markers play an essential role in writing. Therefore, the researchers aim to assess the awareness of discourse marker usage in the writing of an undergraduate thesis by an English Education student at Sebelas Maret University. This research uses a qualitative case study in order to obtain a deep analysis. The sample of this research is an undergraduate thesis of an English Education student at Sebelas Maret University, chosen based on several criteria. Guided by the literature, the researchers grouped the discourse markers by their functions, and the analysis was conducted on that basis. The analysis found that awareness of discourse marker usage is moderate. Finally, the researchers suggest that undergraduate students familiarize themselves with discourse markers, especially those who want to write a thesis.
Keywords: discourse markers, English education, thesis writing, undergraduate student
Procedia PDF Downloads 357
26952 Resource Constrained Time-Cost Trade-Off Analysis in Construction Project Planning and Control
Authors: Sangwon Han, Chengquan Jin
Abstract:
Time-cost trade-off (TCTO) is one of the most significant parts of construction project management. Despite its significance, current TCTO analysis, based on the critical path method (CPM), does not consider resource constraints and accordingly sometimes generates an impractical and/or infeasible schedule in terms of resource availability. Therefore, resource constraints need to be considered when doing TCTO analysis. In this research, a genetic algorithm (GA) based optimization model is created in order to find the optimal schedule. This model is utilized to compare four distinct scenarios (i.e., 1) initial CPM, 2) TCTO without considering resource constraints, 3) resource allocation after TCTO, and 4) TCTO with resource constraints considered) in terms of duration, cost, and resource utilization. The comparison results identify that ‘TCTO with resource constraints considered’ generates the optimal schedule with respect to duration, cost, and resources. This verifies the need to consider resource constraints when doing TCTO analysis. It is expected that the proposed model will produce more feasible and optimal schedules.
Keywords: time-cost trade-off, genetic algorithms, critical path, resource availability
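A GA-based TCTO search can be sketched on a toy serial activity network. All activity durations, costs, the deadline, and the penalty weight below are hypothetical, and the deadline penalty stands in for the schedule constraint; a real model would also encode resource profiles and a full network, not a series chain.

```python
import random

# Hypothetical serial activities: (normal_dur, normal_cost, crash_dur, crash_cost)
ACTS = [(8, 100, 5, 180), (6, 80, 4, 150), (10, 120, 7, 200)]
DEADLINE = 18   # stand-in schedule constraint (days)
PENALTY = 1000  # cost per day of overrun

def fitness(genes):
    """Total cost plus a penalty for exceeding the deadline (lower is better)."""
    dur = sum(a[2] if g else a[0] for g, a in zip(genes, ACTS))
    cost = sum(a[3] if g else a[1] for g, a in zip(genes, ACTS))
    return cost + PENALTY * max(0, dur - DEADLINE)

def evolve(pop_size=20, gens=40, seed=1):
    """Tiny GA: elitism, one-point crossover, bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in ACTS] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]          # keep the better half
        children = []
        while len(children) < pop_size - len(elite):
            p1, p2 = rng.sample(elite, 2)     # parents
            cut = rng.randrange(1, len(ACTS))
            child = p1[:cut] + p2[cut:]       # one-point crossover
            if rng.random() < 0.2:            # mutation
                i = rng.randrange(len(ACTS))
                child[i] = 1 - child[i]
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

Here each gene chooses normal or crashed mode for an activity; the same chromosome layout extends naturally to intermediate crash levels and resource-feasibility penalties.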
Procedia PDF Downloads 187
26951 Longitudinal Analysis of Internet Speed Data in the Gulf Cooperation Council Region
Authors: Musab Isah
Abstract:
This paper presents a longitudinal analysis of Internet speed data in the Gulf Cooperation Council (GCC) region, focusing on the most populous cities of each of the six countries – Riyadh, Saudi Arabia; Dubai, UAE; Kuwait City, Kuwait; Doha, Qatar; Manama, Bahrain; and Muscat, Oman. The study utilizes data collected from the Measurement Lab (M-Lab) infrastructure over a five-year period from January 1, 2019, to December 31, 2023. The analysis includes downstream and upstream throughput data for the cities, covering significant events such as the launch of 5G networks in 2019, COVID-19-induced lockdowns in 2020 and 2021, and the subsequent recovery period and return to normalcy. The results showcase substantial increases in Internet speeds across the cities, highlighting improvements in both download and upload throughput over the years. All the GCC countries have achieved above-average Internet speeds that can conveniently support various online activities and applications with excellent user experience.
Keywords: internet data science, internet performance measurement, throughput analysis, internet speed, measurement lab, network diagnostic tool
Procedia PDF Downloads 62
26950 Integrating Data Envelopment Analysis and Variance Inflation Factor to Measure the Efficiency of Decision Making Units
Authors: Mostafa Kazemi, Zahra N. Farkhani
Abstract:
This paper proposes an integrated Data Envelopment Analysis (DEA) and Variance Inflation Factor (VIF) model for measuring the technical efficiency of decision making units. The model is validated using a set of 69 dairy-product sales representatives. The analysis is done in two stages: in the first stage, the VIF technique is used to distinguish the effective independent factors of resellers, and in the second stage, DEA is used to measure efficiency under both constant and variable returns to scale. DEA is further used to examine the effect of environmental factors on efficiency. The results of this paper indicate an average managerial efficiency of 83% among the sales representatives as a whole. In addition, technical and scale efficiency were calculated as 96% and 80%, respectively. 38% of the sales representatives have a technical efficiency of 100%, and 72% of the sales representatives are quite efficient in terms of managerial efficiency. High levels of relative efficiency indicate a good condition for sales representative efficiency.
Keywords: data envelopment analysis (DEA), relative efficiency, sales representatives’ dairy products, variance inflation factor (VIF)
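The first-stage VIF screening can be illustrated with the two-predictor case, where the VIF reduces to 1/(1-r²) with r the correlation between the predictors. The data below are hypothetical, not the study's.

```python
def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

def vif_two_predictors(x1, x2):
    """VIF for either of two predictors: 1 / (1 - R^2), where R^2 = r^2."""
    r = pearson(x1, x2)
    return 1.0 / (1.0 - r * r)

# Hypothetical reseller factors: x2 is nearly proportional to x1,
# so both carry almost the same information.
x1 = [1, 2, 3, 4, 5]
x2 = [2.1, 3.9, 6.2, 8.0, 9.9]
print(round(vif_two_predictors(x1, x2), 1))  # large VIF flags collinearity
```

Predictors whose VIF exceeds a chosen cutoff (commonly 5 or 10) would be dropped before the DEA stage; with more predictors, R² comes from regressing each predictor on all the others.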
Procedia PDF Downloads 568
26949 Effects of Small Amount of Poly(D-Lactic Acid) on the Properties of Poly(L-Lactic Acid)/Microcrystalline Cellulose/Poly(D-Lactic Acid) Blends
Authors: Md. Hafezur Rahaman, Md. Sagor Hosen, Md. Abdul Gafur, Rasel Habib
Abstract:
This research is a systematic study of the effects of poly(D-lactic acid) (PDLA) on the properties of poly(L-lactic acid) (PLLA)/microcrystalline cellulose (MCC)/PDLA blends through stereocomplex crystallization. Blends were prepared with a constant percentage (3 percent) of MCC and different percentages of PDLA by the solution casting method. These blends were characterized by Fourier transform infrared spectroscopy (FTIR) for confirmation of blend compatibility, wide-angle X-ray scattering (WAXS) and scanning electron microscopy (SEM) for analysis of morphology, and thermogravimetric analysis (TGA) and differential thermal analysis (DTA) for measurement of thermal properties. FTIR analysis confirmed that no new characteristic absorption peaks appeared in the spectrum; instead, shifts of peaks due to hydrogen bonding indicate the compatibility of the blend components. The development of three new peaks in the XRD analysis strongly indicates the formation of stereocomplex crystallinity in the PLLA structure with the addition of PDLA. TGA and DTG results indicate that PDLA can improve the heat resistance of the PLLA/MCC blends by increasing their degradation temperature. Comparison of the DTA peaks also confirms the improved thermal properties. SEM images show the improvement of the surface morphology.
Keywords: microcrystalline cellulose, poly(l-lactic acid), stereocomplex crystallization, thermal stability
Procedia PDF Downloads 135
26948 Identify the Renewable Energy Potential through Sustainability Indicators and Multicriteria Analysis
Authors: Camila Lima, Murilo Andrade Valle, Patrícia Teixeira Leite Asano
Abstract:
The growth in demand for electricity caused by human development, together with the depletion and environmental impacts associated with traditional sources of electricity generation, has made new energy sources increasingly encouraged and necessary for companies in the electricity sector. Based on this scenario, this paper assesses the negative environmental impacts associated with thermoelectric power plants in Brazil, pointing out the importance of using renewable energy sources to reduce environmental harm. The article identifies an energy alternative, wind energy, for the municipalities of São Paulo, represented by georeferenced maps built with the help of GIS, using sustainability indicators and multicriteria analysis as premises in the decision-making process.
Keywords: GIS (geographic information systems), multicriteria analysis, sustainability, wind energy
Procedia PDF Downloads 365
26947 The Non-Linear Analysis of Brain Response to Visual Stimuli
Authors: H. Namazi, H. T. N. Kuan
Abstract:
Brain activity can be measured by acquiring and analyzing EEG signals from an individual. In fact, the human brain's response to external and internal stimuli is mapped in its EEG signals. Over the years, methods such as the Fourier transform, wavelet transform, and empirical mode decomposition have been used to analyze EEG signals in order to find the effect of stimuli, especially external stimuli. However, each of these methods has weaknesses in the analysis of EEG signals. For instance, the Fourier transform and wavelet transform are linear signal analysis methods, which makes them poorly suited to EEG signals as nonlinear signals. In this research, we analyze the brain's response to visual stimuli by extracting information in the form of various measures from EEG signals using software developed by our research group. The measures used are Jeffrey's measure, fractal dimension, and the Hurst exponent. The results of these analyses are useful not only for a fundamental understanding of the brain's response to visual stimuli but also provide good recommendations for clinical purposes.
Keywords: visual stimuli, brain response, EEG signal, fractal dimension, Hurst exponent, Jeffrey’s measure
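Of the measures named, the Hurst exponent is the easiest to sketch via rescaled-range (R/S) analysis. The simplified estimator below runs on synthetic data and is not the group's software; window sizes and the test signal are arbitrary choices.

```python
import math
import random

def rescaled_range(series):
    """R/S statistic for one window: range of cumulative deviations over std."""
    n = len(series)
    mean = sum(series) / n
    cum, r_max, r_min = 0.0, 0.0, 0.0
    for x in series:
        cum += x - mean
        r_max, r_min = max(r_max, cum), min(r_min, cum)
    s = math.sqrt(sum((x - mean) ** 2 for x in series) / n)
    return (r_max - r_min) / s if s > 0 else 0.0

def hurst(series, sizes=(8, 16, 32, 64)):
    """Estimate H as the slope of log(mean R/S) against log(window size)."""
    xs, ys = [], []
    for n in sizes:
        rs = [rescaled_range(series[i:i + n])
              for i in range(0, len(series) - n + 1, n)]
        rs = [r for r in rs if r > 0]
        xs.append(math.log(n))
        ys.append(math.log(sum(rs) / len(rs)))
    k = len(xs)
    mx, my = sum(xs) / k, sum(ys) / k
    # least-squares slope of the log-log relationship
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)

rng = random.Random(0)
noise = [rng.gauss(0.0, 1.0) for _ in range(512)]
print(round(hurst(noise), 2))  # white noise should come out near 0.5
```

Values near 0.5 indicate uncorrelated fluctuations, while a persistent (trending) signal such as a random walk pushes the estimate toward 1.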
Procedia PDF Downloads 561
26946 Hydrology and Hydraulics Analysis of Beko Abo Dam and Appurtenant Structure Design, Ethiopia
Authors: Azazhu Wassie
Abstract:
This study evaluated the maximum design flood for the design of appurtenant structures using climatological and hydrological data analysis of the referenced study area. The maximum design flood is determined by flood frequency analysis. Using this method, the peak discharge is 32,583.67 m3/s, but because the dam site is not at the gauged station, the data are transferred, and the peak discharge becomes 38,115 m3/s. The study was conducted in June 2023. This dam is built across a river to create a reservoir on its upstream side for impounding water. The water stored in the reservoir is used for various purposes, such as irrigation, hydropower, navigation, and fishing. The total average volume of annual runoff is estimated to be 115.1 billion m3. The total potential of the land for irrigation development can go beyond 3 million ha.
Keywords: dam design, flow duration curve, peak flood, rainfall, reservoir capacity, risk and reliability
Procedia PDF Downloads 26
26945 Study of Mobile Game Addiction Using Electroencephalography Data Analysis
Authors: Arsalan Ansari, Muhammad Dawood Idrees, Maria Hafeez
Abstract:
Use of mobile phones has increased considerably over the past decade. Currently, it is one of the main sources of communication and information. Initially, mobile phones were limited to calls and messages, but with the advent of new technology, smartphones came to be used for many other purposes, including video games. Despite positive outcomes, addiction to video games on mobile phones has become a leading cause of psychological and physiological problems among many people. Several researchers have examined different aspects of behavioral addiction with the use of different scales. The objective of this study is to examine any distinction between mobile game addicted and non-addicted players with the use of electroencephalography (EEG), based upon psycho-physiological indicators. The mobile players were asked to play a mobile game, and EEG signals were recorded by BIOPAC equipment with AcqKnowledge as the data acquisition software. Electrodes were placed following the 10-20 system. EEG was recorded at a sampling rate of 200 samples/sec (12,000 samples/min). EEG recordings were obtained from the frontal (Fp1, Fp2), parietal (P3, P4), and occipital (O1, O2) lobes of the brain. The frontal lobe is associated with behavioral control, personality, and emotions. The parietal lobe is involved in perception, understanding logic, and arithmetic. The occipital lobe plays a role in visual tasks. For this study, a 60-second time window was chosen for analysis. Preliminary analysis of the signals was carried out with the AcqKnowledge software of BIOPAC Systems. From the survey based on the CGS manual study 2010, it was concluded that five participants out of fifteen were in the addicted category. This was used as prior information to group the addicted and non-addicted players by physiological analysis.
Statistical analysis showed that by applying a clustering analysis technique, the authors were able to categorize the addicted and non-addicted players, specifically in the theta frequency range of the occipital area.
Keywords: mobile game, addiction, psycho-physiology, EEG analysis
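The clustering step can be illustrated with a one-dimensional Lloyd's (k-means) algorithm on per-player theta-band power. The power values below are hypothetical, standing in for features extracted from the occipital channels.

```python
def kmeans_1d(values, k=2, iters=50):
    """Simple Lloyd's algorithm on scalar features (e.g., theta-band power)."""
    # spread the initial centers across the sorted values
    centers = sorted(values)[:: max(1, len(values) // k)][:k]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            j = min(range(k), key=lambda c: abs(v - centers[c]))
            clusters[j].append(v)
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return centers, clusters

# Hypothetical mean theta power per player (arbitrary units):
# a low-power group and a clearly higher-power group.
power = [1.1, 0.9, 1.3, 1.0, 3.2, 3.5, 3.0, 1.2, 3.4, 0.8]
centers, clusters = kmeans_1d(power)
print(sorted(round(c, 2) for c in centers))
```

With well-separated groups, as here, the two cluster centers recover the addicted/non-addicted split suggested by the survey labels.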
Procedia PDF Downloads 164
26944 The Relationship between Sleep Traits and Tinnitus in UK Biobank: A Population-Based Cohort Study
Authors: Jiajia Peng, Yijun Dong, Jianjun Ren, Yu Zhao
Abstract:
Objectives: Understanding the association between sleep traits and tinnitus could help prevent and provide appropriate interventions against tinnitus. Therefore, this study aimed to assess the relationship between different sleep patterns and tinnitus. Design: A cross-sectional analysis using baseline data (2006–2010, n=168,064) by logistic regressions was conducted to evaluate the association between sleep traits (including the overall health sleep score and five sleep behaviors), and the occurrence (yes/no), frequency (constant/transient), and severity (upsetting/not upsetting) of tinnitus. Further, a prospective analysis of participants without tinnitus at baseline (n=9,581) was performed, who had been followed up for seven years (2012–2019) to assess the association between new-onset tinnitus and sleep characteristics. Moreover, a subgroup analysis was also carried out to estimate the differences in sex by dividing the participants into male and female groups. A sensitivity analysis was also conducted by excluding ear-related diseases to avoid their confounding effects on tinnitus (n=102,159). Results: In the cross-sectional analysis, participants with “current tinnitus” (OR: 1.13, 95% CI: 1.04–1.22, p=0.004) had a higher risk of having a poor overall healthy sleep score and unhealthy sleep behaviors such as short sleep durations (OR: 1.09, 95% CI: 1.04–1.14, p<0.001), late chronotypes (OR: 1.09, 95% CI: 1.05–1.13, p<0.001), and sleeplessness (OR: 1.16, 95% CI: 1.11–1.22, p<0.001) than those participants who “did not have current tinnitus.” However, this trend was not obvious between “constant tinnitus” and “transient tinnitus.” When considering the severity of tinnitus, the risk of “upsetting tinnitus” was obviously higher if participants had lower overall healthy sleep scores (OR: 1.31, 95% CI: 1.13–1.53, p<0.001). 
Additionally, short sleep duration (OR: 1.22, 95% CI: 1.12–1.33, p<0.001), late chronotypes (OR: 1.13, 95% CI: 1.04–1.22, p=0.003), and sleeplessness (OR: 1.43, 95% CI: 1.29–1.59, p<0.001) showed positive correlations with “upsetting tinnitus.” In the prospective analysis, sleeplessness presented a consistently significant association with “upsetting tinnitus” (RR: 2.28, P=0.001). Consistent results were observed in the sex subgroup analysis, where a much more pronounced trend was identified in females compared with males. The results of the sensitivity analysis were consistent with those of the cross-sectional and prospective analyses. Conclusions: Different types of sleep disturbance may be associated with the occurrence and severity of tinnitus; therefore, precise interventions for different types of sleep disturbance, particularly sleeplessness, may help in the prevention and treatment of tinnitus.
Keywords: tinnitus, sleep, sleep behaviors, sleep disturbance
Procedia PDF Downloads 142
26943 Post Pandemic Mobility Analysis through Indexing and Sharding in MongoDB: Performance Optimization and Insights
Authors: Karan Vishavjit, Aakash Lakra, Shafaq Khan
Abstract:
The COVID-19 pandemic has pushed healthcare professionals to use big data analytics as a vital tool for tracking and evaluating the effects of contagious viruses. To effectively analyze huge datasets, efficient NoSQL databases are needed. This research integrates several datasets, which makes possible the analysis of post-COVID-19 health and well-being outcomes and the evaluation of the effectiveness of government efforts during the pandemic, while cutting down query processing time and creating predictive visual artifacts. We recommend applying sharding and indexing technologies to improve query effectiveness and scalability as the dataset expands. Spreading the datasets into a sharded database and indexing the individual shards makes effective data retrieval and analysis possible. The key goal is the analysis of connections between governmental activities, poverty levels, and post-pandemic well-being. We want to evaluate the effectiveness of governmental initiatives to improve health and lower poverty levels by utilising advanced data analysis and visualisations. The findings provide relevant data that supports the advancement of the UN sustainable development goals, future pandemic preparation, and evidence-based decision-making. This study shows how big data and NoSQL databases may be used to address problems in global health.
Keywords: big data, COVID-19, health, indexing, NoSQL, sharding, scalability, well-being
Procedia PDF Downloads 70
26942 Ergonomical Study of Hand-Arm Vibrational Exposure in a Gear Manufacturing Plant in India
Authors: Santosh Kumar, M. Muralidhar
Abstract:
The term ‘ergonomics’ is derived from two Greek words: ‘ergon’, meaning work, and ‘nomoi’, meaning natural laws. Ergonomics is the study of how working conditions, machines, and equipment can be arranged so that people can work with them more efficiently. In this research communication, an attempt has been made to study the effect of hand-arm vibration exposure on the workers of a gear manufacturing plant by comparing potential carpal tunnel syndrome (CTS) symptoms and the effect of different vibration exposure levels on the occurrence of CTS in an actual industrial environment. The chi-square test and correlation analysis were considered for statistical analysis. From the chi-square test, it was found that the occurrence of potential CTS symptoms is significantly dependent on the level of vibration exposure. Data analysis indicates that 40.51% of workers having potential CTS symptoms are exposed to vibration. Correlation analysis reveals that potential CTS symptoms are significantly correlated with exposure to vibration from handheld tools and with repetitive wrist movements.
Keywords: CTS symptoms, hand-arm vibration, ergonomics, physical tests
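The chi-square test of independence on a 2x2 table can be computed directly from observed and expected counts. The counts below are hypothetical, not the study's data.

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row = [a + b, c + d]          # row totals
    col = [a + c, b + d]          # column totals
    chi2 = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            exp = row[i] * col[j] / n   # expected count under independence
            chi2 += (obs - exp) ** 2 / exp
    return chi2

# Hypothetical counts: rows = vibration exposed / not exposed,
# columns = CTS symptoms present / absent.
table = [(32, 47), (8, 71)]
print(round(chi_square_2x2(table), 2))  # → 19.28
```

The statistic is then compared against the chi-square distribution with one degree of freedom (critical value 3.84 at the 5% level) to judge whether symptom occurrence depends on exposure.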
Procedia PDF Downloads 371
26941 Holomorphic Prioritization of Sets within Decagram of Strategic Decision Making of POSM Using Operational Research (OR): Analytic Hierarchy Process (AHP) Analysis
Authors: Elias Ogutu Azariah Tembe, Hussain Abdullah Habib Al-Salamin
Abstract:
There is a decagram of strategic decisions of operations and production/service management (POSM) within operational research (OR) which must be collated, namely: design, inventory, quality, location, process and capacity, layout, scheduling, maintenance, and supply chain. This paper presents an architectural-configuration conceptual framework for this decagram of decision sets in the form of a mathematical complete graph and an abelian graph. Mathematically, a complete graph, whether undirected (UDG) or directed (DG), expresses a relationship in which every pair of vertices is connected, collated, confluent, and holomorphic. However, no study has been conducted that prioritizes the holomorphic sets of POSM within the OR field of study. This study utilizes a structured OR technique known as the Analytic Hierarchy Process (AHP) for organizing, sorting, and prioritizing (ranking) the sets within the decagram of POSM according to their attribution (propensity), and provides an analysis of how the prioritization has real-world application in the 21st century.
Keywords: holomorphic, decagram, decagon, confluent, complete graph, AHP analysis, SCM, HRM, OR, OM, abelian graph
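The AHP ranking step can be sketched with the row geometric mean approximation of the priority vector. The 3x3 pairwise comparison matrix below is hypothetical; a full decagram prioritization would use a 10x10 matrix of judgments on Saaty's 1-9 scale.

```python
import math

def ahp_priorities(matrix):
    """Approximate the AHP priority vector by normalized row geometric means."""
    n = len(matrix)
    gms = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

# Hypothetical pairwise judgments for three of the ten decision areas,
# e.g., quality vs. inventory vs. layout (reciprocal matrix on Saaty's scale).
matrix = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
weights = ahp_priorities(matrix)
print([round(w, 3) for w in weights])  # → [0.637, 0.258, 0.105]
```

The resulting weights give the ranking of the decision sets; a full AHP would also check the consistency ratio of the judgment matrix before accepting the priorities.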
Procedia PDF Downloads 402
26940 The Use of Random Set Method in Reliability Analysis of Deep Excavations
Authors: Arefeh Arabaninezhad, Ali Fakher
Abstract:
Since the deterministic analysis methods fail to take system uncertainties into account, probabilistic and non-probabilistic methods are suggested. Geotechnical analyses are used to determine the stress and deformation caused by construction; accordingly, many input variables which depend on ground behavior are required for geotechnical analyses. The Random Set approach is an applicable reliability analysis method when comprehensive sources of information are not available. Using Random Set method, with relatively small number of simulations compared to fully probabilistic methods, smooth extremes on system responses are obtained. Therefore random set approach has been proposed for reliability analysis in geotechnical problems. In the present study, the application of random set method in reliability analysis of deep excavations is investigated through three deep excavation projects which were monitored during the excavating process. A finite element code is utilized for numerical modeling. Two expected ranges, from different sources of information, are established for each input variable, and a specific probability assignment is defined for each range. To determine the most influential input variables and subsequently reducing the number of required finite element calculations, sensitivity analysis is carried out. Input data for finite element model are obtained by combining the upper and lower bounds of the input variables. The relevant probability share of each finite element calculation is determined considering the probability assigned to input variables present in these combinations. Horizontal displacement of the top point of excavation is considered as the main response of the system. The result of reliability analysis for each intended deep excavation is presented by constructing the Belief and Plausibility distribution function (i.e. lower and upper bounds) of system response obtained from deterministic finite element calculations. 
To evaluate the quality of the input variables as well as the applied reliability analysis method, the range of displacements extracted from the models was compared to the in situ measurements, and good agreement is observed. The comparison also showed that the Random Set Finite Element Method is applicable for estimating the horizontal displacement of the top point of a deep excavation. Finally, the probability of failure or unsatisfactory performance of the system is evaluated by comparing the threshold displacement with the reliability analysis results.
Keywords: deep excavation, random set finite element method, reliability analysis, uncertainty
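The combination step described above, evaluating the model at the bounds of each input range and weighting each combination by its assigned probability, can be sketched with a toy surrogate model. The variables, ranges, masses, and the displacement formula below are illustrative assumptions, not values from the study:

```python
from itertools import product

# Hypothetical focal elements (range, probability mass) for two input
# variables, e.g. soil modulus E [MPa] and friction angle phi [deg].
E_sets = [((20.0, 40.0), 0.6), ((30.0, 60.0), 0.4)]
phi_sets = [((28.0, 34.0), 0.5), ((30.0, 38.0), 0.5)]

def response(E, phi):
    # Stand-in for the finite element model: a simple monotone
    # surrogate for horizontal wall displacement [mm].
    return 500.0 / E + 100.0 / phi

# Vertex method: evaluate the model at every corner of each joint focal
# element to bound the system response on that element.
focal_responses = []
for (E_rng, mE), (p_rng, mp) in product(E_sets, phi_sets):
    corners = [response(E, p) for E in E_rng for p in p_rng]
    focal_responses.append(((min(corners), max(corners)), mE * mp))

def belief(threshold):
    # Mass of focal elements entirely below the threshold (lower bound).
    return sum(m for (lo, hi), m in focal_responses if hi <= threshold)

def plausibility(threshold):
    # Mass of focal elements that could fall below the threshold (upper bound).
    return sum(m for (lo, hi), m in focal_responses if lo <= threshold)
```

Evaluating `belief` and `plausibility` over a range of thresholds traces out the lower and upper bounding distribution functions of the system response, as in the abstract.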
Procedia PDF Downloads 268
26939 Deformation Analysis of Pneumatized Sphenoid Bone Caused Due to Elevated Intracranial Pressure Using Finite Element Analysis
Authors: Dilesh Mogre, Jitendra Toravi, Saurabh Joshi, Prutha Deshpande, Aishwarya Kura
Abstract:
In the early days of technology, it was not possible to understand the nature of complex biomedical problems, which were left to clinical postulation. With the advancement of science today, we have tools like finite element modeling and simulation to solve complex biomedical problems. This paper presents how ANSYS Workbench can be used to study the deformation of the pneumatized sphenoid bone caused by increased intracranial pressure. Intracranial pressure refers to the pressure inside the skull. An increase in this pressure above the normal range of 15 mmHg can lead to serious conditions due to the developed stresses and deformation. One of the areas where deformation is suspected to occur is the sphenoid bone. Moreover, the varying degree of pneumatization increases the complexity of the condition. It is therefore necessary to study deformation patterns of a pneumatized sphenoid bone model at elevated intracranial pressure. Finite element analysis plays a major role in developing and analyzing such models and gives quantitative results.
Keywords: intracranial pressure, pneumatized sphenoid bone, deformation, finite element analysis
Procedia PDF Downloads 194
26938 Elastic Stress Analysis of Annular Bi-Material Discs with Variable Thickness under Mechanical and Thermomechanical Loads
Authors: Erhan Çetin, Ali Kurşun, Şafak Aksoy, Merve Tunay Çetin
Abstract:
This closed-form study deals with the elastic stress analysis of annular bi-material discs of variable thickness subjected to mechanical and thermomechanical loads. Such discs have many applications in the aerospace industry, for example in gas turbines and gears, where they normally work under thermal and mechanical loads. Their life cycle can increase when the stress components are minimized. Each material is assumed to be isotropic. The results show that material combinations and thickness profiles play an important role in determining the response of bi-material discs and in the optimal design of such structures. The stress distribution is investigated, and the results are shown as graphs.
Keywords: bi-material discs, elastic stress analysis, mechanical loads, rotating discs
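As a baseline for such analyses, the stresses in a rotating annular disc can be evaluated with the classical closed-form plane-stress solution. The sketch below uses the textbook single-material, uniform-thickness case (not the paper's bi-material, variable-thickness formulation) with illustrative steel properties:

```python
import math

def rotating_disc_stresses(r, a, b, rho, omega, nu):
    """Radial and hoop stresses [Pa] at radius r in a rotating annular disc
    of uniform thickness (classical plane-stress solution) with inner
    radius a, outer radius b, density rho, angular speed omega, and
    Poisson's ratio nu; both boundaries are traction-free."""
    k = (3.0 + nu) / 8.0 * rho * omega ** 2
    sigma_r = k * (a ** 2 + b ** 2 - (a * b / r) ** 2 - r ** 2)
    sigma_t = k * (a ** 2 + b ** 2 + (a * b / r) ** 2
                   - (1.0 + 3.0 * nu) / (3.0 + nu) * r ** 2)
    return sigma_r, sigma_t

# Illustrative steel disc spinning at 10,000 rpm (assumed values)
a, b = 0.05, 0.25                        # inner/outer radii [m]
rho, nu = 7850.0, 0.3                    # density [kg/m^3], Poisson's ratio
omega = 10000.0 * 2.0 * math.pi / 60.0   # angular speed [rad/s]
sr_mid, st_mid = rotating_disc_stresses(0.15, a, b, rho, omega, nu)
```

The hoop stress peaks at the bore, which is why thickness profiling and material combination near the inner radius matter for the life cycle of such discs.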
Procedia PDF Downloads 328
26937 Reliability-Based Method for Assessing Liquefaction Potential of Soils
Authors: Mehran Naghizaderokni, Asscar Janalizadechobbasty
Abstract:
This paper explores a probabilistic method for assessing the liquefaction potential of sandy soils. The current simplified methods for assessing soil liquefaction potential use a deterministic safety factor in order to determine whether liquefaction will occur or not. However, these methods are unable to determine the liquefaction probability associated with a safety factor. A solution to this problem can be found by reliability analysis. This paper presents a reliability analysis method based on a popular deterministic liquefaction analysis method. The proposed probabilistic method is formulated based on the results of reliability analyses of 190 field records and observations of soil performance against liquefaction. The results of the present study show that a safety factor greater or smaller than 1 does not by itself indicate safety or liquefaction, respectively; to quantify the liquefaction probability, a reliability-based analysis should be used. This reliability method uses the empirical acceleration attenuation law in the Chalos area to derive the probability density distribution function and the statistics of the earthquake-induced cyclic shear stress ratio (CSR). The CSR and CRR statistics are used together with the first-order second-moment method to calculate the relation between the liquefaction probability, the safety factor, and the reliability index. Based on the proposed method, the liquefaction probability associated with a safety factor can be easily calculated, and the influence of some of the soil parameters on the liquefaction probability can be quantitatively evaluated.
Keywords: liquefaction, reliability analysis, Chalos area, civil and structural engineering
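The relation between the safety factor, the reliability index, and the liquefaction probability can be illustrated with a minimal first-order second-moment calculation. The sketch below assumes independent lognormal CSR and CRR with hypothetical statistics, not the values calibrated from the 190 field records:

```python
import math

def normal_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def liquefaction_reliability(mean_crr, cov_crr, mean_csr, cov_csr):
    """Reliability index beta and liquefaction probability P_f, assuming the
    cyclic resistance ratio (CRR) and cyclic stress ratio (CSR) are
    independent lognormal variables (first-order second-moment form)."""
    # Standard deviations of ln(CRR) and ln(CSR)
    s_crr = math.sqrt(math.log(1.0 + cov_crr ** 2))
    s_csr = math.sqrt(math.log(1.0 + cov_csr ** 2))
    # Medians of the lognormal variables
    med_crr = mean_crr / math.sqrt(1.0 + cov_crr ** 2)
    med_csr = mean_csr / math.sqrt(1.0 + cov_csr ** 2)
    beta = math.log(med_crr / med_csr) / math.sqrt(s_crr ** 2 + s_csr ** 2)
    return beta, normal_cdf(-beta)
```

With, say, a mean CRR of 0.25 and a mean CSR of 0.18, the deterministic safety factor is above 1, yet the computed liquefaction probability is nonzero — the point made in the abstract about safety factors alone being insufficient.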
Procedia PDF Downloads 470
26936 Computational Fluid Dynamics Analysis of a Biomass Burner Gas Chamber in OpenFOAM
Authors: Óscar Alfonso Gómez Sepúlveda, Julián Ernesto Jaramillo, Diego Camilo Durán
Abstract:
The global climate crisis has affected different aspects of human life, and in an effort to reverse its effects, we seek to optimize and improve the equipment and plants that produce high CO₂ emissions, which can be achieved through numerical simulation. Such equipment includes biomass combustion chambers. The objective of this research is to visualize the thermal behavior of a gas chamber used in the process of obtaining vegetable extracts. The simulation is carried out with OpenFOAM, taking into account the conservation of energy, turbulence, and radiation; for the purposes of the simulation, combustion is omitted and replaced by heat generation. Within the results, the streamlines generated by the primary and secondary flows are analyzed in order to check whether they produce the expected effect and whether the energy is used to the maximum. The inclusion of radiation seeks to compare its influence and also to simplify the computational times needed to perform the mesh analysis. An analysis is carried out with simplified geometries and with experimental data to corroborate the selection of the models to be used, and it is found that the appropriate turbulence model is the standard k-ω. As a means of verification, a general energy balance is made and compared with the results of the numerical analysis, where the error is 1.67%, which is considered acceptable. From the assessment of improvement options, it was found that with the implementation of fins, the heat transferred can be increased by up to 7.3%.
Keywords: CFD analysis, biomass, heat transfer, radiation, OpenFOAM
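The fin-based improvement mentioned above can be cross-checked with a classical back-of-envelope fin calculation; the dimensions, temperature difference, and coefficients below are illustrative assumptions, not the chamber's actual values:

```python
import math

# Illustrative fin and gas-side conditions (assumed, not from the paper)
h = 25.0       # convection coefficient, W/(m^2 K)
k = 45.0       # fin thermal conductivity (carbon steel), W/(m K)
t = 0.003      # fin thickness, m
L = 0.05       # fin length, m
w = 0.30       # fin width, m
dT = 400.0     # wall-to-gas temperature difference, K

# Straight rectangular fin with an adiabatic tip
A_c = t * w                        # conduction cross-section, m^2
P = 2.0 * (t + w)                  # wetted perimeter, m
m = math.sqrt(h * P / (k * A_c))   # fin parameter, 1/m
eta = math.tanh(m * L) / (m * L)   # fin efficiency (< 1)

A_fin = P * L                      # exposed fin surface, m^2
q_fin = eta * h * A_fin * dT       # heat transferred by one fin, W
```

A hand estimate of this kind gives a sanity bound on the CFD-predicted gain: the fin can never transfer more than `h * A_fin * dT`, and efficiency drops as fins get longer or thinner.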
Procedia PDF Downloads 118
26935 Analysis of Dynamics Underlying the Observation Time Series by Using a Singular Spectrum Approach
Authors: O. Delage, H. Bencherif, T. Portafaix, A. Bourdier
Abstract:
The main purpose of time series analysis is to learn about the dynamics behind some time-ordered measurement data. Two approaches are used in the literature to gain a better knowledge of the dynamics contained in observation data sequences. The first of these approaches concerns time series decomposition, an important analysis step that allows patterns and behaviors to be extracted as components, providing insight into the mechanisms producing the time series. In many cases, time series are short, noisy, and non-stationary. To provide components that are physically meaningful, methods such as Empirical Mode Decomposition (EMD), Empirical Wavelet Transform (EWT) or, more recently, Empirical Adaptive Wavelet Decomposition (EAWD) have been proposed. The second approach is to reconstruct the dynamics underlying the time series as a trajectory in state space by mapping the time series into a set of Rᵐ lag vectors using the method of delays (MOD). Takens proved that the trajectory obtained with the MOD technique is equivalent to the trajectory representing the dynamics behind the original time series. This work introduces singular spectrum decomposition (SSD), a new adaptive method for decomposing non-linear and non-stationary time series into narrow-band components. This method takes its origin from singular spectrum analysis (SSA), a nonparametric spectral estimation method used for the analysis and prediction of time series. As the first step of SSD is to constitute a trajectory matrix by embedding a one-dimensional time series into a set of lagged vectors, SSD can also be seen as a reconstruction method like MOD. We first give a brief overview of the existing decomposition methods (EMD, EWT, EAWD). The SSD method is then described in detail and applied to experimental time series of observations resulting from total column ozone measurements.
The results obtained will be compared with those provided by the previously mentioned decomposition methods. We will also compare the reconstruction qualities of the observed dynamics obtained from the SSD and MOD methods.
Keywords: time series analysis, adaptive time series decomposition, wavelet, phase space reconstruction, singular spectrum analysis
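The first SSD step described above, embedding the series into a trajectory matrix of lagged vectors, together with an elementary SSA-style decomposition, can be sketched as follows. This is a minimal illustration of the embedding and diagonal-averaging machinery common to SSA and SSD, not the SSD algorithm itself:

```python
import numpy as np

def trajectory_matrix(x, window):
    """Embed a 1-D series into lagged vectors (the method-of-delays step)."""
    n = len(x) - window + 1
    return np.array([x[i:i + window] for i in range(n)])

def elementary_components(x, window):
    """SSA-style decomposition: SVD of the trajectory matrix, then
    anti-diagonal averaging maps each rank-one term back to a series."""
    X = trajectory_matrix(x, window)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    n = X.shape[0]
    comps = []
    for k in range(len(s)):
        Xk = s[k] * np.outer(U[:, k], Vt[k])
        flipped = Xk[::-1]  # anti-diagonals of Xk become diagonals
        comp = np.array([flipped.diagonal(d - n + 1).mean()
                         for d in range(len(x))])
        comps.append(comp)
    return comps
```

Because the trajectory matrix of a noise-free sinusoid has rank two, the first pair of elementary components reconstructs it exactly, which is the mechanism that lets SSA/SSD isolate narrow-band oscillations in data such as total column ozone series.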
Procedia PDF Downloads 104
26934 Ripple Effect Analysis of Government Investment for Research and Development by the Artificial Neural Networks
Authors: Hwayeon Song
Abstract:
The long-term purpose of research and development (R&D) programs is to strengthen national competitiveness by developing new knowledge and technologies. It is therefore important to determine a proper budget for government programs in order to maintain the vigor of R&D when total funding is tight due to the national deficit. In this regard, a ripple effect analysis of budgetary changes in R&D programs is necessary, as well as an investigation of their current status. This study proposes a new approach using Artificial Neural Networks (ANN) for both tasks. It particularly focuses on R&D programs related to Construction and Transportation (C&T) technology in Korea. First, key factors in C&T technology are explored to derive impact indicators in three areas: economy, society, and science and technology (S&T). Simultaneously, an ANN is employed to evaluate the relationships between the data variables. From this process, four major components of R&D, including research personnel, expenses, management, and equipment, are assessed. Then a ripple effect analysis is performed to see the changes in a hypothetical future by modifying the current data. The research findings can offer an alternative strategy for R&D programs as well as a new analysis tool.
Keywords: artificial neural networks, construction and transportation technology, government research and development, ripple effect
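The idea of a ripple effect analysis, training a network on the relationship between R&D components and impact indicators and then perturbing an input to observe the change in the output, can be sketched with a toy network. The data, network size, and the +10% scenario below are invented for illustration and do not reflect the study's model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: four R&D components (personnel, expenses,
# management, equipment) versus a composite impact indicator.
X = rng.uniform(0.0, 1.0, size=(200, 4))
y = 0.4 * X[:, [0]] + 0.3 * X[:, [1]] + 0.2 * X[:, [2]] + 0.1 * X[:, [3]]

# One-hidden-layer network trained by plain full-batch gradient descent
W1 = rng.normal(0.0, 0.5, (4, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

def forward(inp):
    h = np.tanh(inp @ W1 + b1)
    return h, h @ W2 + b2

mse_before = float(np.mean((forward(X)[1] - y) ** 2))
lr = 0.3
for _ in range(5000):
    h, pred = forward(X)
    err = pred - y
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)      # backprop through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2
mse_after = float(np.mean((forward(X)[1] - y) ** 2))

# Ripple-effect analysis: raise the expense input by 10% from a baseline
# scenario and observe the change in the predicted impact indicator.
base = np.full((1, 4), 0.5)
boosted = base.copy(); boosted[0, 1] *= 1.1
ripple = float(forward(boosted)[1] - forward(base)[1])
```

Repeating the perturbation for each of the four components yields a ranked picture of where a marginal budget change matters most, which is the hypothetical-future analysis described in the abstract.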
Procedia PDF Downloads 247
26933 Purchasing Decision-Making in Supply Chain Management: A Bibliometric Analysis
Authors: Ahlem Dhahri, Waleed Omri, Audrey Becuwe, Abdelwahed Omri
Abstract:
In industrial processes, decision-making ranges across different scales, from process control to supply chain management. The purchasing decision-making process in the supply chain is presently gaining more attention as a critical contributor to a company's strategic success. Given the scarcity of thorough summaries in prior studies, this bibliometric analysis adopts a meticulous approach to achieve quantitative knowledge of the constantly evolving subject of purchasing decision-making in supply chain management. Through bibliometric analysis, we examine a sample of 358 peer-reviewed articles from the Scopus database. VOSviewer and Gephi software were employed to analyze, combine, and visualize the data. Data analytic techniques, including citation networks, PageRank analysis, co-citation, and publication trends, have been used to identify influential works and outline the discipline's intellectual structure. The outcomes of this descriptive analysis highlight the most prominent articles, authors, journals, and countries based on their citations and publications. The findings illustrate an increase in the number of publications, exhibiting a slightly growing trend in this field. Co-citation analysis coupled with content analysis of the most cited articles identified five research themes: integrating sustainability into the supplier selection process; supplier selection under disruption risks, with assessment and mitigation strategies; fuzzy MCDM approaches for supplier evaluation and selection; purchasing decisions in vendor problems; and decision-making techniques in supplier selection and order lot-sizing problems. With the help of a graphic timeline, this exhaustive map of the field provides a visual representation of the evolution of publications, demonstrating a gradual shift of research interest from vendor selection problems toward integrating sustainability into the supplier selection process.
These clusters offer insights into the wide variety of purchasing methods and conceptual frameworks that have emerged; however, they have not been validated empirically. The findings suggest that future research should pursue greater depth of practical and empirical analysis to enrich the theories. These outcomes provide a powerful road map for further study in this area.
Keywords: bibliometric analysis, citation analysis, co-citation, Gephi, network analysis, purchasing, SCM, VOSviewer
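The ranking techniques named above (citation networks, PageRank, co-citation) can be illustrated on a toy citation graph. The paper IDs and links below are invented for the sketch, which uses a plain power-iteration PageRank rather than the VOSviewer/Gephi implementations:

```python
from itertools import combinations

# Toy directed citation graph: paper -> papers it cites (illustrative IDs).
cites = {
    "A": ["C", "D"],
    "B": ["C", "D", "E"],
    "C": ["E"],
    "D": ["E"],
    "E": [],
}

def pagerank(graph, damping=0.85, iters=100):
    """Plain power-iteration PageRank, used to rank influential papers."""
    nodes = list(graph)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1.0 - damping) / len(nodes) for n in nodes}
        for n, outs in graph.items():
            if outs:
                share = damping * rank[n] / len(outs)
                for m in outs:
                    new[m] += share
            else:  # dangling node: spread its rank uniformly
                for m in nodes:
                    new[m] += damping * rank[n] / len(nodes)
        rank = new
    return rank

def cocitation_counts(graph):
    """Co-citation: two papers are linked when a third paper cites both."""
    counts = {}
    for outs in graph.values():
        for a, b in combinations(sorted(outs), 2):
            counts[(a, b)] = counts.get((a, b), 0) + 1
    return counts
```

Clustering the co-citation counts (rather than raw citations) is what groups papers into the thematic clusters reported in the abstract, since frequently co-cited papers tend to address the same research theme.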
Procedia PDF Downloads 85
26932 Economic Forecasting Analysis for Solar Photovoltaic Application
Authors: Enas R. Shouman
Abstract:
Economic development together with population growth is leading to a continuous increase in energy demand. At the same time, growing global concern for the environment is driving a decrease in the use of conventional energy sources and an increase in the use of renewable energy sources. The objective of this study is to present the worldwide market trends of solar photovoltaic (PV) technology and to present economic methods for PV financial analysis on the basis of expectations for the expansion of PV in many applications. In the course of this study, detailed information about the current PV market was gathered and analyzed to find the factors influencing the penetration of PV energy. The methodology relies on five relevant economic and financial analysis methods that are often used by investment decision-makers: payback analysis, net benefit analysis, savings-to-investment ratio, adjusted internal rate of return, and life-cycle cost. The results of this study may be considered a marketing guide that helps the diffusion of PV energy. The study showed that PV is economically reliable: consumers pay a higher purchase price for PV system installation but receive lower electricity bills.
Keywords: photovoltaic, financial methods, solar energy, economics, PV panel
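Three of the appraisal methods listed above, payback analysis, net benefit analysis, and the savings-to-investment ratio, can be sketched directly. The system cost, annual savings, discount rate, and lifetime below are illustrative assumptions, not figures from the study:

```python
def simple_payback_years(capital_cost, annual_saving):
    """Undiscounted years to recover the PV system investment."""
    return capital_cost / annual_saving

def present_value_of_savings(annual_saving, rate, years):
    # Discount each year's electricity-bill saving back to today
    return sum(annual_saving / (1.0 + rate) ** t for t in range(1, years + 1))

def net_benefit(capital_cost, annual_saving, rate, years):
    """Discounted savings minus the initial investment (net benefit / NPV)."""
    return present_value_of_savings(annual_saving, rate, years) - capital_cost

def savings_to_investment_ratio(capital_cost, annual_saving, rate, years):
    """SIR > 1 means the discounted savings exceed the investment."""
    return present_value_of_savings(annual_saving, rate, years) / capital_cost

# Illustrative residential PV system (assumed values)
cost, saving, rate, life = 6000.0, 900.0, 0.05, 25
payback = simple_payback_years(cost, saving)
npv = net_benefit(cost, saving, rate, life)
sir = savings_to_investment_ratio(cost, saving, rate, life)
```

With these assumed figures, the higher upfront purchase price is recovered within the system's lifetime, which is the trade-off the abstract describes between installation cost and lower electricity bills.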
Procedia PDF Downloads 109