Search results for: sub channel analysis
26787 NOx Emission and Computational Analysis of Jatropha Curcus Fuel and Crude Oil
Authors: Vipan Kumar Sohpal, Rajesh K Sharma
Abstract:
The depletion of conventional fuels and unchecked vehicle emissions are degrading the environment, which has pushed research toward biofuels. Biofuels from different sources attract research attention due to their low emissions and biodegradability. Combustion of biofuel blends (B-20) drastically reduces emissions of carbon monoxide, carbon dioxide, and hydrocarbons. Contrary to conventional fuel, however, engine tests indicate that nitrogen oxide emissions are higher with biofuels. This paper therefore examines and compares the nitrogen oxide emissions of Jatropha Curcus (JCO) B-20 blends with those of vegetable oil. In addition, a computational analysis of the crude non-edible oil was performed to assess the impact of composition on emission quality. In conclusion, JCO is a potential feedstock for biodiesel production after genetic modification of the plant.
Keywords: jatropha curcus, computational analysis, emissions, NOx biofuels
Procedia PDF Downloads 587
26786 Understanding Mathematics Achievements among U.S. Middle School Students: A Bayesian Multilevel Modeling Analysis with Informative Priors
Authors: Jing Yuan, Hongwei Yang
Abstract:
This paper aims to understand U.S. middle school students’ mathematics achievement by examining relevant student- and school-level predictors. Through a variance component analysis, the study first identifies evidence supporting the use of multilevel modeling. Then, a multilevel analysis is performed under Bayesian statistical inference, where prior information is incorporated into the modeling process. During the analysis, independent variables are entered sequentially in order of theoretical importance to create a hierarchy of models. By evaluating each model using Bayesian fit indices, a best-fitting and most parsimonious model is selected, on which Bayesian inference is performed for result interpretation and discussion. The primary dataset for the Bayesian modeling is derived from the 2012 Program for International Student Assessment (PISA); a secondary PISA dataset from 2003 is analyzed under the traditional ordinary least squares method to provide the information needed to specify informative priors for a subset of the model parameters. The dependent variable is a composite measure of mathematics literacy, calculated from an exploratory factor analysis of all five PISA 2012 mathematics achievement plausible values, for which multiple pieces of evidence support data unidimensionality. The independent variables include demographic variables and content-specific variables: mathematics efficacy, teacher-student ratio, proportion of girls in the school, etc. Finally, the entire analysis is performed using the MCMCpack and MCMCglmm packages in R.
Keywords: Bayesian multilevel modeling, mathematics education, PISA, multilevel
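The variance component step described above can be sketched numerically. Below is a minimal illustration, with invented toy scores rather than PISA data, of the intraclass correlation computed from a one-way variance decomposition; a large ICC is the kind of evidence used to justify multilevel modeling:

```python
# Hypothetical sketch: a one-way variance-components check (ICC) of the kind
# used to justify multilevel modeling. School labels and scores are invented.
from statistics import mean

# scores grouped by school (toy data, not PISA)
schools = {
    "A": [520, 540, 510, 530],
    "B": [460, 450, 470, 455],
    "C": [600, 590, 610, 605],
}

groups = list(schools.values())
k = len(groups)                      # number of schools
n = len(groups[0])                   # students per school (balanced design)
grand = mean(x for g in groups for x in g)

# between-school and within-school mean squares (one-way ANOVA)
ms_between = n * sum((mean(g) - grand) ** 2 for g in groups) / (k - 1)
ms_within = sum((x - mean(g)) ** 2 for g in groups for x in g) / (k * (n - 1))

# intraclass correlation: share of variance attributable to schools
icc = (ms_between - ms_within) / (ms_between + (n - 1) * ms_within)
print(round(icc, 3))  # close to 1 -> strong clustering -> use a multilevel model
```

With these toy scores the schools differ far more than students within a school, so the ICC comes out near 1 and a multilevel model is warranted.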
Procedia PDF Downloads 336
26785 Copula-Based Estimation of Direct and Indirect Effects in Path Analysis Models
Authors: Alam Ali, Ashok Kumar Pathak
Abstract:
Path analysis is a statistical technique used to evaluate the direct and indirect effects of variables in path models. One or more structural regression equations are used to estimate a series of parameters in path models to find the best fit to the data. However, the assumptions of classical regression models, such as ordinary least squares (OLS), are sometimes violated by the nature of the data, resulting in insignificant direct and indirect effects of exogenous variables. This article explores the effectiveness of a copula-based regression approach as an alternative to classical regression, specifically when variables are linked through an elliptical copula.
Keywords: path analysis, copula-based regression models, direct and indirect effects, k-fold cross validation technique
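The direct and indirect effects that path analysis estimates can be illustrated with a toy sketch. Assuming a simple X → M → Y model with invented simulated data, the indirect effect is the product of the X → M and M → Y slopes, and the direct effect is the X slope after controlling for M. OLS is used here purely for illustration; the paper's point is that a copula-based regression would replace these fits when the data violate OLS assumptions:

```python
# Illustrative sketch (invented data): direct and indirect effects in a simple
# X -> M -> Y path model, estimated with ordinary least squares.
import random

random.seed(1)
n = 2000
x = [random.gauss(0, 1) for _ in range(n)]
m = [0.5 * xi + random.gauss(0, 1) for xi in x]                        # true a = 0.5
y = [0.3 * xi + 0.4 * mi + random.gauss(0, 1) for xi, mi in zip(x, m)]  # true c' = 0.3, b = 0.4

def ols2(yv, x1, x2):
    """OLS slopes of yv on (x1, x2) via normal equations (data approximately centered, so no intercept)."""
    s11 = sum(a * a for a in x1); s22 = sum(a * a for a in x2)
    s12 = sum(a * b for a, b in zip(x1, x2))
    s1y = sum(a * b for a, b in zip(x1, yv)); s2y = sum(a * b for a, b in zip(x2, yv))
    det = s11 * s22 - s12 * s12
    return (s22 * s1y - s12 * s2y) / det, (s11 * s2y - s12 * s1y) / det

a_hat = sum(xi * mi for xi, mi in zip(x, m)) / sum(xi * xi for xi in x)  # M ~ X
c_dir, b_hat = ols2(y, x, m)                                             # Y ~ X + M

print("direct effect  :", round(c_dir, 2))          # near 0.30
print("indirect effect:", round(a_hat * b_hat, 2))  # near 0.5 * 0.4 = 0.20
```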
Procedia PDF Downloads 42
26784 A Microcosm Study on the Response of Phytoplankton and Bacterial Community of the Subarctic Northeast Atlantic Ocean to Oil Pollution under Projected Atmospheric CO₂ Conditions
Authors: Afiq Mohd Fahmi, Tony Gutierrez, Sebastian Hennige
Abstract:
Increasing amounts of CO₂ entering the marine environment, a process known as ocean acidification (OA), are documented as having harmful impacts on a variety of marine organisms. Assessing the future risk of hydrocarbon pollution, which is also generally detrimental to marine life, must therefore consider how OA-induced changes to microbial communities compound that risk, since hydrocarbon degradation is influenced by the community-level microbial response. This study aims to evaluate the effects of increased atmospheric CO₂ conditions and oil enrichment on phytoplankton-associated bacterial communities. The Faroe Shetland Channel (FSC) is a subarctic region in the northeast Atlantic where crude oil extraction has recently been expanded. In the event of a major oil spill in this region, it is vital that we understand the response of the bacterial community and its consequences for primary production, as some phytoplankton communities in the ocean harbor hydrocarbon-degrading bacteria associated with their phycosphere. Surface water containing phytoplankton and bacteria from the FSC was cultured in microcosms under ambient and elevated atmospheric CO₂ conditions for 4 days of acclimation before 1% (v/v) crude oil was introduced to simulate oil spill conditions at sea. It was found that elevated CO₂ conditions did not significantly affect the chl a concentration, whereas exposure to crude oil detrimentally affected the chl a concentration for up to 10 days. The diversity and richness of the bacterial community were not significantly affected by either the CO₂ treatment or the oil enrichment.
The increase in the relative abundance of known hydrocarbon degraders such as Oleispira, Marinobacter and Halomonas indicates the potential for biodegradation of crude oil, while the resilience of the dominant taxa Colwellia, unclassified Gammaproteobacteria, unclassified Rhodobacteria and unclassified Halomonadaceae could be associated with the recovery of the microalgal community 13 days after oil exposure. Therefore, the microbial community from the subsurface of the FSC has the potential to recover from crude oil pollution even under elevated CO₂ (750 ppm) conditions.
Keywords: phytoplankton, bacteria, crude oil, ocean acidification
Procedia PDF Downloads 237
26783 Mitigation of Size Effects in Woven Fabric Composites Using Finite Element Analysis Approach
Authors: Azeez Shaik, Yagnik Kalariya, Amit Salvi
Abstract:
High-performance requirements and emission norms are forcing the automobile industry to opt for lightweight materials that improve fuel efficiency and absorb energy in crash applications. In such scenarios, woven fabric composites provide better energy absorption than metals. Woven fabric composites have a repetitive unit cell (RUC), and the mechanical properties of these materials are highly dependent on it. This work investigates the importance of detailed modelling of the RUC, the associated size effects, and techniques to mitigate them using a finite element analysis approach.
Keywords: repetitive unit cell, representative volume element, size effects, cohesive zone, finite element analysis
Procedia PDF Downloads 255
26782 Urea and Starch Detection on a Paper-Based Microfluidic Device Enabled on a Smartphone
Authors: Shashank Kumar, Mansi Chandra, Ujjawal Singh, Parth Gupta, Rishi Ram, Arnab Sarkar
Abstract:
Milk is one of our most basic and primary sources of food and energy, consumed from birth. Hence, checking the quality and purity of milk and the concentrations of its constituents is a necessary step. Considering the importance of milk purity for human health, the following study simultaneously detects and quantifies different adulterants, such as urea and starch, in milk with the help of a paper-based microfluidic device integrated with a smartphone. Detection of the urea and starch concentrations is based on the principle of colorimetry, while fluid flow in the device is driven by capillary action in the porous medium. The proposed microfluidic channel is equipped with a specialized detection zone and employs a colorimetric indicator that undergoes a visible color change when the milk reacts with a set of reagents, confirming the presence of adulterants. In the proposed work, iodine is used to detect the percentage of starch in the milk, whereas p-DMAB is used for urea. A direct correlation was found between color change intensity and adulterant concentration, and a calibration curve was constructed relating color intensity to the starch and urea concentrations. The device is cheap to produce and easy to dispose of, which makes it highly suitable for widespread adoption, especially in resource-constrained settings. Moreover, a smartphone application has been developed to detect, capture, and analyze the change in color intensity due to the presence of adulterants in the milk. The low-cost nature of the paper-based sensor, coupled with its integration with smartphones, makes it an attractive solution for widespread use. These devices are affordable, simple to use, and do not require specialized training, making them ideal tools for regulatory bodies and concerned consumers.
Keywords: paper based microfluidic device, milk adulteration, urea detection, starch detection, smartphone application
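The calibration step described above can be sketched as a simple least-squares fit: intensity is regressed on the concentration of the standards, and the line is inverted to estimate an unknown sample. All intensities and concentrations below are invented for illustration:

```python
# Hypothetical sketch of the colorimetric calibration: fit a straight line of
# color intensity vs. adulterant concentration, then invert it to estimate the
# concentration in an unknown sample. All numbers are invented.
conc = [0.0, 0.5, 1.0, 1.5, 2.0]            # % urea in the standard samples
intensity = [12.0, 25.5, 38.8, 52.1, 65.0]  # mean channel intensity from the phone camera

n = len(conc)
mx, my = sum(conc) / n, sum(intensity) / n
slope = sum((x - mx) * (y - my) for x, y in zip(conc, intensity)) / sum((x - mx) ** 2 for x in conc)
intercept = my - slope * mx

def estimate_concentration(i_sample):
    """Invert the calibration line: concentration for a measured intensity."""
    return (i_sample - intercept) / slope

print(round(estimate_concentration(45.0), 2))  # estimated % urea in an unknown sample
```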
Procedia PDF Downloads 65
26781 The Moderation Effect of Critical Item on the Strategic Purchasing: Quality Performance Relationship
Authors: Kwong Yeung
Abstract:
Theories about strategic purchasing and quality performance are underdeveloped. Understanding the evolving role of purchasing from reactive to proactive is a pressing strategic issue. Using survey responses from 176 manufacturing and electronics industry professionals, we study the relationships between strategic purchasing and supply chain partners’ quality performance to answer the following questions: Can transaction cost economics be used to elucidate the strategic purchasing-quality performance relationship? Is this strategic purchasing-quality performance relationship moderated by critical item analysis? The findings indicate that critical item analysis positively and significantly moderates the strategic purchasing-quality performance relationship.
Keywords: critical item analysis, moderation, quality performance, strategic purchasing, transaction cost economics
Procedia PDF Downloads 563
26780 Coordinated Voltage Control in a Radial Distribution System
Authors: Shivarudraswamy, Anubhav Shrivastava, Lakshya Bhat
Abstract:
Distributed generation has become a major area of interest in recent years. Distributed generation can serve a large number of loads on a power line and hence offers better efficiency than conventional methods. However, there are certain drawbacks associated with it, voltage rise being the major one. This paper addresses voltage control at the buses of an IEEE 30-bus system by regulating reactive power. For the analysis, suitable locations for placing distributed generators (DGs) are identified through load flow analysis, observing where the voltage profile dips. MATLAB programming is used to regulate the voltage at all buses to within +/-5% of the base value even after the introduction of the DGs. Three methods for voltage regulation are discussed, and a sensitivity-based analysis is then carried out to determine the priority among them.
Keywords: distributed generators, distributed system, reactive power, voltage control
Procedia PDF Downloads 500
26779 An Analysis of Discourse Markers Awareness in Writing Undergraduate Thesis of English Education Student in Sebelas Maret University
Authors: Oktanika Wahyu Nurjanah, Anggun Fitriana Dewi
Abstract:
An undergraduate thesis is a piece of academic writing that should fulfill several characteristics, one of them being coherence. The coherence of a text depends in part on the use of discourse markers; in other words, discourse markers play an essential role in writing. Therefore, the researchers aim to assess awareness of discourse marker usage in the writing of an undergraduate thesis by an English Education student at Sebelas Maret University. This research uses a qualitative case study in order to obtain a deep analysis. The sample is an undergraduate thesis of an English Education student at Sebelas Maret University, chosen based on several criteria. Guided by the literature, the researchers grouped the discourse markers by their functions and conducted the analysis on that basis. The analysis found that awareness of discourse marker usage is moderate. Finally, the researchers suggest that undergraduate students, especially those who intend to write a thesis, familiarize themselves with discourse markers.
Keywords: discourse markers, English education, thesis writing, undergraduate student
Procedia PDF Downloads 357
26778 Resource Constrained Time-Cost Trade-Off Analysis in Construction Project Planning and Control
Authors: Sangwon Han, Chengquan Jin
Abstract:
Time-cost trade-off (TCTO) analysis is one of the most significant parts of construction project management. Despite its significance, current TCTO analysis, based on the critical path method (CPM), does not consider resource constraints and accordingly can generate impractical and/or infeasible schedules in terms of resource availability. Therefore, resource constraints need to be considered in TCTO analysis. In this research, a genetic algorithm (GA) based optimization model is created to find the optimal schedule. The model is used to compare four distinct scenarios, i.e., (1) the initial CPM schedule, (2) TCTO without resource constraints, (3) resource allocation after TCTO, and (4) TCTO with resource constraints, in terms of duration, cost, and resource utilization. The comparison identifies that TCTO with resource constraints generates the optimal schedule with respect to duration, cost, and resources, verifying the need to consider resource constraints in TCTO analysis. It is expected that the proposed model will produce more feasible and optimal schedules.
Keywords: time-cost trade-off, genetic algorithms, critical path, resource availability
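The GA component can be sketched minimally under simplifying assumptions: three serial activities with invented (duration, cost) crashing options and a deadline, with no CPM network logic or resource constraints, which the paper's full model would include:

```python
# A minimal GA sketch of the time-cost trade-off: each activity has (duration, cost)
# options, and the GA searches for the option mix that minimizes total cost while
# meeting a deadline. Activities are in series here for simplicity; all data invented.
import random

random.seed(0)
# (duration, cost) alternatives per activity: crashing shortens duration but raises cost
options = [
    [(4, 100), (3, 150), (2, 230)],
    [(6, 200), (5, 260), (4, 350)],
    [(5, 120), (4, 170), (3, 260)],
]
DEADLINE = 12  # the project must finish within 12 time units

def fitness(ind):
    """Total cost plus a heavy penalty if the (serial) duration misses the deadline."""
    dur = sum(options[i][g][0] for i, g in enumerate(ind))
    cost = sum(options[i][g][1] for i, g in enumerate(ind))
    return cost + 1000 * max(0, dur - DEADLINE)

pop = [[random.randrange(3) for _ in options] for _ in range(20)]
for _ in range(100):
    pop.sort(key=fitness)
    elite = pop[:10]                      # elitist selection
    children = []
    for _ in range(10):
        a, b = random.sample(elite, 2)
        cut = random.randrange(1, len(options))
        child = a[:cut] + b[cut:]         # one-point crossover
        if random.random() < 0.2:         # mutation
            child[random.randrange(len(child))] = random.randrange(3)
        children.append(child)
    pop = elite + children

best = min(pop, key=fitness)
print("best option mix:", best, "fitness:", fitness(best))
```

With this tiny search space the GA should settle on the cheapest option mix that still meets the deadline.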
Procedia PDF Downloads 187
26777 Longitudinal Analysis of Internet Speed Data in the Gulf Cooperation Council Region
Authors: Musab Isah
Abstract:
This paper presents a longitudinal analysis of Internet speed data in the Gulf Cooperation Council (GCC) region, focusing on the most populous cities of each of the six countries – Riyadh, Saudi Arabia; Dubai, UAE; Kuwait City, Kuwait; Doha, Qatar; Manama, Bahrain; and Muscat, Oman. The study utilizes data collected from the Measurement Lab (M-Lab) infrastructure over a five-year period from January 1, 2019, to December 31, 2023. The analysis includes downstream and upstream throughput data for the cities, covering significant events such as the launch of 5G networks in 2019, COVID-19-induced lockdowns in 2020 and 2021, and the subsequent recovery period and return to normalcy. The results showcase substantial increases in Internet speeds across the cities, highlighting improvements in both download and upload throughput over the years. All the GCC countries have achieved above-average Internet speeds that can conveniently support various online activities and applications with excellent user experience.
Keywords: internet data science, internet performance measurement, throughput analysis, internet speed, measurement lab, network diagnostic tool
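The yearly aggregation behind such a longitudinal analysis can be sketched as follows; the records are invented stand-ins for M-Lab NDT measurements, not the study's data:

```python
# A minimal sketch of the longitudinal aggregation: yearly median download
# throughput per city from (city, year, Mbps) records. Values are invented.
from statistics import median
from collections import defaultdict

records = [
    ("Riyadh", 2019, 18.2), ("Riyadh", 2019, 22.5), ("Riyadh", 2023, 94.0),
    ("Riyadh", 2023, 101.3), ("Dubai", 2019, 30.1), ("Dubai", 2023, 142.7),
]

by_city_year = defaultdict(list)
for city, year, mbps in records:
    by_city_year[(city, year)].append(mbps)

for (city, year), speeds in sorted(by_city_year.items()):
    print(f"{city} {year}: median {median(speeds):.1f} Mbps")
```

A median is preferred over a mean here because crowd-sourced speed-test samples are heavily skewed by outliers.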
Procedia PDF Downloads 62
26776 Integrating Data Envelopment Analysis and Variance Inflation Factor to Measure the Efficiency of Decision Making Units
Authors: Mostafa Kazemi, Zahra N. Farkhani
Abstract:
This paper proposes an integrated data envelopment analysis (DEA) and variance inflation factor (VIF) model for measuring the technical efficiency of decision-making units. The model is validated on a set of 69 dairy-product sales representatives. The analysis is done in two stages: in the first stage, the VIF technique is used to identify the independent, effective factors of the resellers, and in the second stage DEA is used to measure efficiency under both constant and variable returns to scale. DEA is further used to examine the effect of environmental factors on efficiency. The results indicate an average managerial efficiency of 83% across the dairy-product sales representatives. In addition, technical and scale efficiency were 96% and 80%, respectively. 38% of the sales representatives have a technical efficiency of 100%, and 72% of them are quite efficient in terms of managerial efficiency. High levels of relative efficiency indicate a good condition for sales representative efficiency.
Keywords: data envelopment analysis (DEA), relative efficiency, sales representatives’ dairy products, variance inflation factor (VIF)
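The VIF screening stage can be sketched with a toy computation. The data are invented, and for simplicity each predictor is regressed on a single other predictor; with more predictors, the auxiliary regression would be a multiple regression:

```python
# Illustrative sketch (invented data) of VIF screening: regress each predictor
# on the others and compute VIF = 1 / (1 - R^2); a large VIF flags collinear
# inputs to drop before running DEA.
import random

random.seed(42)
n = 500
x1 = [random.gauss(0, 1) for _ in range(n)]
x2 = [0.9 * a + 0.3 * random.gauss(0, 1) for a in x1]   # nearly collinear with x1
x3 = [random.gauss(0, 1) for _ in range(n)]              # independent of x1

def r_squared(y, x):
    """R^2 of a simple regression of y on a single predictor x."""
    my, mx = sum(y) / n, sum(x) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

def vif(y, x):
    return 1.0 / (1.0 - r_squared(y, x))

print("VIF(x1 | x2):", round(vif(x1, x2), 1))  # large -> collinear pair
print("VIF(x1 | x3):", round(vif(x1, x3), 1))  # near 1 -> acceptable
```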
Procedia PDF Downloads 568
26775 Effects of Small Amount of Poly(D-Lactic Acid) on the Properties of Poly(L-Lactic Acid)/Microcrystalline Cellulose/Poly(D-Lactic Acid) Blends
Authors: Md. Hafezur Rahaman, Md. Sagor Hosen, Md. Abdul Gafur, Rasel Habib
Abstract:
This research systematically studies the effects of poly(D-lactic acid) (PDLA) on the properties of poly(L-lactic acid) (PLLA)/microcrystalline cellulose (MCC)/PDLA blends through stereocomplex crystallization. Blends were prepared by solution casting with a constant percentage (3 percent) of MCC and different percentages of PDLA. The blends were characterized by Fourier transform infrared spectroscopy (FTIR) to confirm blend compatibility, wide-angle X-ray scattering (WAXS) and scanning electron microscopy (SEM) for morphology, and thermogravimetric analysis (TGA) and differential thermal analysis (DTA) for thermal properties. FTIR analysis confirms that no new characteristic absorption peaks appear in the spectrum; instead, peak shifts due to hydrogen bonding indicate the compatibility of the blend components. The development of three new peaks in the XRD analysis strongly indicates the formation of stereocomplex crystallinity in the PLLA structure upon the addition of PDLA. TGA and DTG results indicate that PDLA can improve the heat resistance of the PLLA/MCC blends by increasing their degradation temperature. Comparison of the DTA peaks likewise confirms the improved thermal properties, and SEM images show improved surface morphology.
Keywords: microcrystalline cellulose, poly(l-lactic acid), stereocomplex crystallization, thermal stability
Procedia PDF Downloads 135
26774 Identify the Renewable Energy Potential through Sustainability Indicators and Multicriteria Analysis
Authors: Camila Lima, Murilo Andrade Valle, Patrícia Teixeira Leite Asano
Abstract:
The growth in demand for electricity caused by human development, together with the depletion of and environmental impacts caused by traditional sources of electricity generation, has made new energy sources increasingly encouraged and necessary for companies in the electricity sector. Against this background, this paper assesses the negative environmental impacts associated with thermoelectric power plants in Brazil and points out the importance of using renewable energy sources to reduce environmental harm. The paper identifies an energy alternative, wind energy, for the municipalities of São Paulo, represented in georeferenced maps built with GIS, using sustainability indicators and multicriteria analysis in the decision-making process.
Keywords: GIS (geographic information systems), multicriteria analysis, sustainability, wind energy
Procedia PDF Downloads 365
26773 The Non-Linear Analysis of Brain Response to Visual Stimuli
Authors: H. Namazi, H. T. N. Kuan
Abstract:
Brain activity can be measured by acquiring and analyzing EEG signals from an individual. In fact, the human brain’s response to external and internal stimuli is mapped in its EEG signals. Over the years, methods such as the Fourier transform, wavelet transform, and empirical mode decomposition have been used to analyze EEG signals in order to find the effect of stimuli, especially external stimuli. However, each of these methods has weaknesses. For instance, the Fourier transform and wavelet transform are linear signal analysis methods, which makes them ill-suited to EEG signals, which are nonlinear. In this research, we analyze the brain’s response to visual stimuli by extracting information in the form of various measures from EEG signals using software developed by our research group. The measures used are Jeffrey’s measure, the fractal dimension, and the Hurst exponent. The results of these analyses are useful not only for a fundamental understanding of the brain’s response to visual stimuli but also provide good recommendations for clinical purposes.
Keywords: visual stimuli, brain response, EEG signal, fractal dimension, hurst exponent, Jeffrey’s measure
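One of the nonlinear measures named above, the Hurst exponent, can be sketched via rescaled-range (R/S) analysis. The signal below is synthetic white noise standing in for an EEG trace; for an uncorrelated signal the estimate should land near 0.5, while persistent signals give values above 0.5:

```python
# A compact sketch of Hurst-exponent estimation by rescaled-range (R/S) analysis.
# The input is synthetic white noise, not a real EEG recording.
import math, random

random.seed(7)
signal = [random.gauss(0, 1) for _ in range(4096)]  # stand-in for an EEG trace

def rescaled_range(x):
    """R/S statistic of one window: range of the cumulative deviations over the std."""
    m = sum(x) / len(x)
    dev = [v - m for v in x]
    cum, z = 0.0, []
    for d in dev:
        cum += d
        z.append(cum)
    r = max(z) - min(z)
    s = math.sqrt(sum(d * d for d in dev) / len(x))
    return r / s

def hurst(x, sizes=(64, 128, 256, 512, 1024)):
    """Slope of log(mean R/S) vs. log(window size) is the Hurst exponent."""
    logs_n, logs_rs = [], []
    for n in sizes:
        chunks = [x[i:i + n] for i in range(0, len(x) - n + 1, n)]
        rs = sum(rescaled_range(c) for c in chunks) / len(chunks)
        logs_n.append(math.log(n)); logs_rs.append(math.log(rs))
    k = len(sizes)
    mn, mr = sum(logs_n) / k, sum(logs_rs) / k
    return sum((a - mn) * (b - mr) for a, b in zip(logs_n, logs_rs)) / sum((a - mn) ** 2 for a in logs_n)

print(round(hurst(signal), 2))  # expected in the vicinity of 0.5 for white noise
```

Note that the small-sample R/S estimator has a known upward bias at these window sizes, so values slightly above 0.5 are normal for white noise.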
Procedia PDF Downloads 561
26772 Static and Dynamical Analysis on Clutch Discs on Different Material and Geometries
Authors: Jairo Aparecido Martins, Estaner Claro Romão
Abstract:
This paper presents the static and cyclic stresses, in combination with fatigue analysis, resulting from the loads applied to the friction discs typically used in industrial clutches. The material chosen to simulate the friction discs under load is aluminum. The numerical simulation was performed with the COMSOL™ Multiphysics software. The results obtained for static loads showed sufficient stiffness for both geometries and the material used. From the fatigue standpoint, on the other hand, failure is clearly verified, which demonstrates the importance of both approaches, and of the dynamic analysis in particular. The results and conclusions are based on the stresses on the disc, the counted stress cycles, and the fatigue usage factor.
Keywords: aluminum, industrial clutch, static and dynamic loading, numerical simulation
Procedia PDF Downloads 188
26771 Hydrology and Hydraulics Analysis of Beko Abo Dam and Appurtenant Structure Design, Ethiopia
Authors: Azazhu Wassie
Abstract:
This study evaluates the maximum design flood for the design of the dam’s appurtenant structures using climatological and hydrological data for the referenced study area. The maximum design flood is determined by flood frequency analysis. Using this method, the peak discharge at the gauged station is 32,583.67 m³/s; because the dam site is not at the gauged station, the data are transferred, giving a peak discharge of 38,115 m³/s. The study was conducted in June 2023. The dam is built across a river to create a reservoir on its upstream side for impounding water. The water stored in the reservoir is used for various purposes, such as irrigation, hydropower, navigation, and fishing. The total average volume of annual runoff is estimated to be 115.1 billion m³, and the total potential of the land for irrigation development can exceed 3 million ha.
Keywords: dam design, flow duration curve, peak flood, rainfall, reservoir capacity, risk and reliability
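The flood frequency step can be sketched with a Gumbel (EV1) fit by the method of moments; the annual peak-flow series below is invented, not the study's data, and the study may well have used a different distribution:

```python
# A sketch of a flood-frequency estimate: fit a Gumbel (EV1) distribution to
# annual peak flows by the method of moments and read off the T-year design
# flood. The peak-flow series is invented for illustration.
import math
from statistics import mean, stdev

peaks = [8200, 10400, 9100, 12750, 7600, 11900, 9800, 14100, 8900, 13200]  # m3/s

m, s = mean(peaks), stdev(peaks)
alpha = math.sqrt(6) * s / math.pi       # Gumbel scale parameter (method of moments)
u = m - 0.5772 * alpha                   # Gumbel location parameter

def design_flood(T):
    """Peak discharge with return period T years (inverse Gumbel CDF)."""
    return u - alpha * math.log(-math.log(1 - 1 / T))

print(round(design_flood(100)))  # 100-year design flood, m3/s
```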
Procedia PDF Downloads 28
26770 Study of Mobile Game Addiction Using Electroencephalography Data Analysis
Authors: Arsalan Ansari, Muhammad Dawood Idrees, Maria Hafeez
Abstract:
The use of mobile phones has increased considerably over the past decade; currently, it is one of the main sources of communication and information. Initially, mobile phones were limited to calls and messages, but with the advent of new technology, smartphones came to be used for many other purposes, including video games. Despite the positive outcomes, addiction to video games on mobile phones has become a leading cause of psychological and physiological problems in many people. Several researchers have examined different aspects of behavioral addiction using different scales. The objective of this study is to examine, using electroencephalography (EEG), whether mobile-game-addicted and non-addicted players can be distinguished on the basis of psycho-physiological indicators. The players were asked to play a mobile game while EEG signals were recorded with BIOPAC equipment using AcqKnowledge as the data acquisition software. Electrodes were placed following the 10-20 system. EEG was recorded at a sampling rate of 200 samples/sec (12,000 samples/min). Recordings were obtained from the frontal (Fp1, Fp2), parietal (P3, P4), and occipital (O1, O2) lobes of the brain. The frontal lobe is associated with behavioral control, personality, and emotions; the parietal lobe is involved in perception, understanding logic, and arithmetic; and the occipital lobe plays a role in visual tasks. For this study, a 60-second time window was chosen for analysis. Preliminary analysis of the signals was carried out with the AcqKnowledge software from BIOPAC Systems. From a survey based on the CGS manual study (2010), it was concluded that five of the fifteen participants fell into the addicted category, and this was used as prior information for grouping the addicted and non-addicted players in the physiological analysis.
Statistical analysis showed that, by applying a clustering analysis technique, the authors were able to separate the addicted and non-addicted players, specifically in the theta frequency range of the occipital area.
Keywords: mobile game, addiction, psycho-physiology, EEG analysis
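The clustering step can be illustrated with a toy one-dimensional k-means (k=2) on invented per-subject theta-band power values; the study's actual feature extraction and clustering method may differ:

```python
# A toy sketch of the clustering idea: 1-D k-means (k=2) on per-subject
# theta-band power, separating a high-theta group from a low-theta group.
# The power values are invented, not taken from the study.
theta_power = [4.1, 3.8, 4.3, 9.2, 8.7, 3.9, 9.5, 4.0, 8.9, 4.2]  # arbitrary units

c_lo, c_hi = min(theta_power), max(theta_power)   # initial centroids
for _ in range(20):
    # assign each value to the nearer centroid, then move the centroids
    lo = [v for v in theta_power if abs(v - c_lo) <= abs(v - c_hi)]
    hi = [v for v in theta_power if abs(v - c_lo) > abs(v - c_hi)]
    c_lo, c_hi = sum(lo) / len(lo), sum(hi) / len(hi)

print(sorted(hi))  # the high-theta cluster
```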
Procedia PDF Downloads 164
26769 The Relationship between Sleep Traits and Tinnitus in UK Biobank: A Population-Based Cohort Study
Authors: Jiajia Peng, Yijun Dong, Jianjun Ren, Yu Zhao
Abstract:
Objectives: Understanding the association between sleep traits and tinnitus could help prevent and provide appropriate interventions against tinnitus. Therefore, this study aimed to assess the relationship between different sleep patterns and tinnitus. Design: A cross-sectional analysis of baseline data (2006–2010, n=168,064) using logistic regressions was conducted to evaluate the association between sleep traits (the overall healthy sleep score and five sleep behaviors) and the occurrence (yes/no), frequency (constant/transient), and severity (upsetting/not upsetting) of tinnitus. Further, a prospective analysis of participants without tinnitus at baseline (n=9,581) who had been followed up for seven years (2012–2019) was performed to assess the association between new-onset tinnitus and sleep characteristics. A subgroup analysis was also carried out to estimate sex differences by dividing the participants into male and female groups, and a sensitivity analysis excluding ear-related diseases was conducted to avoid their confounding effects on tinnitus (n=102,159). Results: In the cross-sectional analysis, participants with “current tinnitus” (OR: 1.13, 95% CI: 1.04–1.22, p=0.004) had a higher risk of having a poor overall healthy sleep score and unhealthy sleep behaviors such as short sleep durations (OR: 1.09, 95% CI: 1.04–1.14, p<0.001), late chronotypes (OR: 1.09, 95% CI: 1.05–1.13, p<0.001), and sleeplessness (OR: 1.16, 95% CI: 1.11–1.22, p<0.001) than participants who did not have current tinnitus. However, this trend was not obvious between “constant tinnitus” and “transient tinnitus.” When considering the severity of tinnitus, the risk of “upsetting tinnitus” was markedly higher if participants had lower overall healthy sleep scores (OR: 1.31, 95% CI: 1.13–1.53, p<0.001).
Additionally, short sleep duration (OR: 1.22, 95% CI: 1.12–1.33, p<0.001), late chronotypes (OR: 1.13, 95% CI: 1.04–1.22, p=0.003), and sleeplessness (OR: 1.43, 95% CI: 1.29–1.59, p<0.001) showed positive correlations with “upsetting tinnitus.” In the prospective analysis, sleeplessness presented a consistently significant association with “upsetting tinnitus” (RR: 2.28, p=0.001). Consistent results were observed in the sex subgroup analysis, where a much more pronounced trend was identified in females than in males. The results of the sensitivity analysis were consistent with those of the cross-sectional and prospective analyses. Conclusions: Different types of sleep disturbance may be associated with the occurrence and severity of tinnitus; therefore, precise interventions for different types of sleep disturbance, particularly sleeplessness, may help in the prevention and treatment of tinnitus.
Keywords: tinnitus, sleep, sleep behaviors, sleep disturbance
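The odds ratios reported above come from logistic regressions on UK Biobank data. As a hypothetical sketch, an unadjusted odds ratio and its 95% confidence interval can be computed from a 2x2 table; the counts below are invented, not from the study, and the full analysis adjusts for covariates:

```python
# Hypothetical sketch: unadjusted odds ratio and Wald 95% CI from a 2x2 table
# of sleeplessness vs. tinnitus. The counts are invented, not UK Biobank data.
import math

a, b = 300, 700    # sleepless:     with tinnitus / without
c, d = 200, 800    # not sleepless: with tinnitus / without

or_ = (a * d) / (b * c)                       # cross-product odds ratio
se = math.sqrt(1/a + 1/b + 1/c + 1/d)         # SE of log(OR)
lo = math.exp(math.log(or_) - 1.96 * se)
hi = math.exp(math.log(or_) + 1.96 * se)
print(f"OR = {or_:.2f}, 95% CI: {lo:.2f}-{hi:.2f}")  # OR = 1.71, 95% CI: 1.40-2.11
```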
Procedia PDF Downloads 142
26768 Post Pandemic Mobility Analysis through Indexing and Sharding in MongoDB: Performance Optimization and Insights
Authors: Karan Vishavjit, Aakash Lakra, Shafaq Khan
Abstract:
The COVID-19 pandemic has pushed healthcare professionals to use big data analytics as a vital tool for tracking and evaluating the effects of contagious viruses, and analyzing such huge datasets requires efficient NoSQL databases. This research integrates several datasets, cutting down query processing time and creating predictive visual artifacts, to enable the analysis of post-COVID-19 health and well-being outcomes and the evaluation of the effectiveness of government efforts during the pandemic. We recommend applying sharding and indexing technologies to improve query effectiveness and scalability as the dataset expands: spreading the datasets across a sharded database and building indexes on the individual shards makes data retrieval and analysis efficient. The key goal is the analysis of the connections between governmental activities, poverty levels, and post-pandemic well-being. We evaluate the effectiveness of governmental initiatives to improve health and lower poverty levels by utilising advanced data analysis and visualisations. The findings provide relevant data that supports the advancement of the UN sustainable development goals, future pandemic preparedness, and evidence-based decision-making. This study shows how big data and NoSQL databases may be used to address problems in global health.
Keywords: big data, COVID-19, health, indexing, NoSQL, sharding, scalability, well being
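The effect of hashed sharding can be illustrated with a toy simulation in plain Python (not MongoDB itself; the field names are invented): documents are routed to shards by hashing a shard key, so reads and writes spread across machines instead of piling onto one:

```python
# A toy illustration of hashed sharding: route each document to one of N shards
# by hashing its shard key. This simulates the idea, not MongoDB's implementation;
# the "region" field and counts are invented.
import hashlib

N_SHARDS = 3
docs = [{"region": f"R{i % 7}", "cases": i} for i in range(1000)]

shards = {i: [] for i in range(N_SHARDS)}
for doc in docs:
    key = hashlib.md5(doc["region"].encode()).hexdigest()  # hash the shard key
    shards[int(key, 16) % N_SHARDS].append(doc)            # pick the target shard

for sid, chunk in shards.items():
    print(f"shard {sid}: {len(chunk)} documents")
```

Every document lands on exactly one shard, so a query filtered on the shard key touches only that shard, while per-shard indexes keep lookups within a shard fast.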
Procedia PDF Downloads 70
26767 Ergonomical Study of Hand-Arm Vibrational Exposure in a Gear Manufacturing Plant in India
Authors: Santosh Kumar, M. Muralidhar
Abstract:
The term ‘ergonomics’ is derived from two Greek words: ‘ergon’, meaning work, and ‘nomoi’, meaning natural laws. Ergonomics is the study of how working conditions, machines, and equipment can be arranged so that people can work with them more efficiently. This research communication studies the effect of hand-arm vibration exposure on the workers of a gear manufacturing plant by comparing potential carpal tunnel syndrome (CTS) symptoms and the effect of different vibration exposure levels on the occurrence of CTS in an actual industrial environment. The chi-square test and correlation analysis were used for the statistical analysis. The chi-square test shows that the occurrence of potential CTS symptoms depends significantly on the level of vibration exposure. Data analysis indicates that 40.51% of the workers with potential CTS symptoms are exposed to vibration. Correlation analysis reveals that potential CTS symptoms are significantly correlated with the level of vibration exposure from handheld tools and with repetitive wrist movements.
Keywords: CTS symptoms, hand-arm vibration, ergonomics, physical tests
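The chi-square test of independence used above can be sketched on an invented 2x2 table of exposure vs. symptoms (the counts are not the study's data):

```python
# Illustrative sketch (invented counts) of the chi-square test of independence:
# CTS symptoms vs. vibration exposure in a 2x2 table.
table = [[64, 94],   # exposed:     symptoms / no symptoms
         [22, 120]]  # not exposed: symptoms / no symptoms

row = [sum(r) for r in table]
col = [sum(c) for c in zip(*table)]
total = sum(row)

# sum over cells of (observed - expected)^2 / expected
chi2 = sum(
    (table[i][j] - row[i] * col[j] / total) ** 2 / (row[i] * col[j] / total)
    for i in range(2) for j in range(2)
)
print(round(chi2, 2))  # compare with 3.84, the critical value for df=1, alpha=0.05
```

A statistic well above 3.84 rejects independence, i.e., symptom occurrence depends on exposure.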
Procedia PDF Downloads 371
26766 Holomorphic Prioritization of Sets within Decagram of Strategic Decision Making of POSM Using Operational Research (OR): Analytic Hierarchy Process (AHP) Analysis
Authors: Elias Ogutu Azariah Tembe, Hussain Abdullah Habib Al-Salamin
Abstract:
There is a decagram of strategic decisions of operations and production/service management (POSM) within operational research (OR) which must collate, namely: design, inventory, quality, location, process and capacity, layout, scheduling, maintenance, and supply chain. This paper presents an architectural configuration conceptual framework of the decagram of decision sets in the form of a mathematical complete graph and an abelian graph. Mathematically, a complete graph, whether undirected (UDG) or directed (DG), is one in which every pair of vertices is connected, collated, confluent, and holomorphic. No study conducted so far, however, has prioritized the holomorphic sets of POSM within the OR field of study. The study utilizes a structured OR technique known as the Analytic Hierarchy Process (AHP) for organizing, sorting, and prioritizing (ranking) the sets within the decagram of POSM according to their attribution (propensity), and analyzes how the prioritization has real-world application in the 21st century.Keywords: holomorphic, decagram, decagon, confluent, complete graph, AHP analysis, SCM, HRM, OR, OM, abelian graph
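The AHP ranking step can be sketched as follows. The pairwise comparison matrix is hypothetical and covers only three of the ten decision areas; the priority vector is computed with the common row geometric-mean approximation to the principal eigenvector, not necessarily the exact procedure of this study.

```python
import numpy as np

# Hypothetical 3x3 pairwise comparison matrix (Saaty's 1-9 scale,
# reciprocal by construction) for, say, quality, inventory, and layout.
A = np.array([
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 0.5, 1.0],
])

# Priority vector via row geometric means, normalized to sum to 1.
gm = A.prod(axis=1) ** (1.0 / A.shape[1])
weights = gm / gm.sum()
# The largest weight ranks first; here the first decision area dominates.
```

Repeating this for all ten POSM decision areas (a 10x10 matrix) yields the full ranking; a consistency-ratio check would normally accompany it.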
Procedia PDF Downloads 40226765 A Microsurgery-Specific End-Effector Equipped with a Bipolar Surgical Tool and Haptic Feedback
Authors: Hamidreza Hoshyarmanesh, Sanju Lama, Garnette R. Sutherland
Abstract:
In tele-operative robotic surgery, an ideal haptic device should be equipped with an intuitive and smooth end-effector to cover the surgeon’s hand/wrist degrees of freedom (DOF) and translate the hand joint motions to the end-effector of the remote manipulator with low effort and a high level of comfort. This research introduces the design and development of a microsurgery-specific end-effector, a gimbal mechanism possessing 4 passive and 1 active DOFs, equipped with a bipolar forceps and haptic feedback. The robust gimbal structure comprises three light-weight links/joints (pitch, yaw, and roll), each consisting of a low-friction support and a 2-channel accurate optical position sensor. The third link, which provides the tool roll, was specifically designed to grip the tool prongs and accommodate a low-mass geared actuator together with a miniaturized capstan-rope mechanism. The actuator is able to generate delicate torques, using a threaded cylindrical capstan, to emulate the sense of pinch/coagulation during conventional microsurgery. While the left prong of the tool is fixed to the rolling link, the right prong bears a miniaturized drum sector with a large diameter to expand the force scale and resolution. The drum transmits the actuator output torque to the right prong and generates haptic force feedback at the tool level. The tool is also equipped with a Hall-effect sensor and a magnet bar installed vis-à-vis on the inner sides of the two prongs to measure the tooltip distance and provide an analogue signal to the control system. We believe that such a haptic end-effector could significantly increase the accuracy of telerobotic surgery and help avoid the high forces that are known to cause bleeding/injury.Keywords: end-effector, force generation, haptic interface, robotic surgery, surgical tool, tele-operation
Procedia PDF Downloads 11826764 The Use of Random Set Method in Reliability Analysis of Deep Excavations
Authors: Arefeh Arabaninezhad, Ali Fakher
Abstract:
Since deterministic analysis methods fail to take system uncertainties into account, probabilistic and non-probabilistic methods have been suggested. Geotechnical analyses are used to determine the stress and deformation caused by construction; accordingly, many input variables that depend on ground behavior are required. The random set approach is an applicable reliability analysis method when comprehensive sources of information are not available. Using the random set method, smooth extremes of the system responses are obtained with a relatively small number of simulations compared to fully probabilistic methods. The random set approach has therefore been proposed for reliability analysis in geotechnical problems. In the present study, the application of the random set method to the reliability analysis of deep excavations is investigated through three deep excavation projects that were monitored during the excavation process. A finite element code is utilized for numerical modeling. Two expected ranges, from different sources of information, are established for each input variable, and a specific probability assignment is defined for each range. To determine the most influential input variables and subsequently reduce the number of required finite element calculations, a sensitivity analysis is carried out. Input data for the finite element model are obtained by combining the upper and lower bounds of the input variables. The relevant probability share of each finite element calculation is determined from the probabilities assigned to the input variable ranges present in these combinations. The horizontal displacement of the top point of the excavation is considered the main response of the system. The result of the reliability analysis for each deep excavation is presented by constructing the Belief and Plausibility distribution functions (i.e., lower and upper bounds) of the system response obtained from the deterministic finite element calculations. To evaluate the quality of the input variables as well as the applied reliability analysis method, the range of displacements extracted from the models has been compared to the in situ measurements, and good agreement is observed. The comparison also showed that the random set finite element method is applicable for estimating the horizontal displacement of the top point of a deep excavation. Finally, the probability of failure or unsatisfactory performance of the system is evaluated by comparing the threshold displacement with the reliability analysis results.Keywords: deep excavation, random set finite element method, reliability analysis, uncertainty
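The combination-and-probability-share step of the random set method can be sketched as follows. Each input variable carries two focal elements (a range plus a probability assignment m, one per information source); every cross-combination defines one set of bound-based finite element runs, and its mass is the product of the assignments. The variables, ranges, and masses below are illustrative, not the values from the monitored projects.

```python
from itertools import product

# Hypothetical focal elements: variable -> [((lower, upper), m), ...].
variables = {
    "E_soil": [((20.0, 40.0), 0.5), ((30.0, 60.0), 0.5)],  # stiffness, MPa
    "phi":    [((28.0, 34.0), 0.6), ((30.0, 38.0), 0.4)],  # friction angle, deg
}

combos = []
for picks in product(*variables.values()):
    ranges = [r for r, m in picks]
    mass = 1.0
    for r, m in picks:
        mass *= m          # probability share of this combination
    combos.append((ranges, mass))

total_mass = sum(m for _, m in combos)
# Each combination feeds a pair of FE runs (all-lower and all-upper input
# bounds); sorting the resulting responses by cumulative mass builds the
# Belief (lower) and Plausibility (upper) distribution functions.
```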
Procedia PDF Downloads 26826763 Deformation Analysis of Pneumatized Sphenoid Bone Caused Due to Elevated Intracranial Pressure Using Finite Element Analysis
Authors: Dilesh Mogre, Jitendra Toravi, Saurabh Joshi, Prutha Deshpande, Aishwarya Kura
Abstract:
In the earlier days of technology, it was not possible to understand the nature of complex biomedical problems, which were left only to clinical postulation. With the advancement of science today, we have tools like finite element modelling and simulation to solve complex biomedical problems. This paper presents how ANSYS WORKBENCH can be used to study the deformation of the pneumatized sphenoid bone caused by increased intracranial pressure. Intracranial pressure refers to the pressure inside the skull. An increase in pressure above the normal range of 15 mmHg can lead to serious conditions due to the developed stresses and deformation. One of the areas where deformation is suspected to occur is the sphenoid bone. Moreover, the varying degree of pneumatization increases the complexity of the conditions. It is therefore necessary to study deformation patterns on a pneumatized sphenoid bone model at elevated intracranial pressure. Finite element analysis plays a major role in developing and analyzing the model and gives quantitative results.Keywords: intracranial pressure, pneumatized sphenoid bone, deformation, finite element analysis
Procedia PDF Downloads 19426762 A Coupled Model for Two-Phase Simulation of a Heavy Water Pressure Vessel Reactor
Authors: D. Ramajo, S. Corzo, M. Nigro
Abstract:
A multi-dimensional computational fluid dynamics (CFD) two-phase model was developed to simulate the in-core coolant circuit of a pressurized heavy water reactor (PHWR) of a commercial nuclear power plant (NPP). Because this PHWR is of the reactor pressure vessel (RPV) type, detailed three-dimensional (3D) models of the large reservoirs of the RPV (the upper and lower plenums and the downcomer) were coupled with an in-house finite volume one-dimensional (1D) code in order to model the 451 coolant channels housing the nuclear fuel. In the 1D code, suitable empirical correlations were used to take into account the in-channel distributed (friction) and concentrated (spacer grids, inlet and outlet throttles) pressure losses. A local power distribution at each coolant channel was also taken into account. The heat transfer between the coolant and the surrounding moderator was accurately calculated using a two-dimensional theoretical model. The implementation of subcooled boiling and condensation models in the 1D code, along with functions representing the thermal and dynamic properties of the coolant and moderator (heavy water), allows estimation of the in-core steam generation under nominal flow conditions for a generic fission power distribution. The in-core mass flow distribution results for steady-state nominal conditions agree with the design expectations, providing a first assessment of the coupled 1D/3D model. Results for the nominal condition were compared with those obtained with a previous 1D/3D single-phase model, yielding more realistic temperature patterns and also allowing visualization of low values of void fraction inside the upper plenum. It must be mentioned that the current results were obtained by imposing prescribed fission power functions from the literature. Therefore, the results are shown with the aim of pointing out the potential of the developed model.Keywords: PHWR, CFD, thermo-hydraulic, two-phase flow
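The distributed and concentrated losses handled by such 1D channel codes follow the usual form Δp = (fL/D + ΣK)·ρv²/2 (Darcy friction plus loss coefficients for grids and throttles). A minimal sketch with assumed values, not the plant's actual data:

```python
# One-channel pressure drop: distributed (friction) plus concentrated
# (spacer grids, inlet/outlet throttles) losses. All values are assumed
# for illustration only.
rho   = 750.0   # kg/m^3, heavy water near operating temperature (assumed)
v     = 8.0     # m/s, channel velocity (assumed)
f     = 0.018   # Darcy friction factor (assumed)
L     = 5.3     # m, heated channel length (assumed)
D     = 0.01    # m, hydraulic diameter (assumed)
K_sum = 2.5     # sum of concentrated loss coefficients (assumed)

dynamic_pressure = 0.5 * rho * v ** 2          # rho*v^2/2, Pa
dp = (f * L / D + K_sum) * dynamic_pressure    # total channel drop, Pa
```

In the coupled model, this per-channel relation (with local properties and boiling corrections) closes the mass flow split among the 451 channels against the 3D plenum pressures.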
Procedia PDF Downloads 46826761 Elastic Stress Analysis of Annular Bi-Material Discs with Variable Thickness under Mechanical and Thermomechanical Loads
Authors: Erhan Çetin, Ali Kurşun, Şafak Aksoy, Merve Tunay Çetin
Abstract:
This closed-form study deals with the elastic stress analysis of annular bi-material discs with variable thickness subjected to mechanical and thermomechanical loads. Such discs have many applications in the aerospace industry, such as gas turbines and gears, and normally work under thermal and mechanical loads. Their life cycle can increase when the stress components are minimized. Each material is assumed to be isotropic. The results show that material combinations and thickness profiles play an important role in determining the responses of bi-material discs and in the optimal design of those structures. The stress distribution is investigated, and the results are shown as graphs.Keywords: bi-material discs, elastic stress analysis, mechanical loads, rotating discs
Procedia PDF Downloads 32826760 The Community Structure of Fish and its Correlation with Mangrove Forest Litter Production in Panjang Island, Banten Bay, Indonesia
Authors: Meilisha Putri Pertiwi, Mufti Petala Patria
Abstract:
Mangrove forests are often categorized as a productive ecosystem in tropical waters and as the highest carbon storage among all forest types. Mangrove-derived organic matter shapes the food web of fish and invertebrates. In Indonesian tropical water ecosystems, 80% of the commercial fish caught in coastal areas are closely tied to the food web of the mangrove forest ecosystem. Based on previous research in Panjang Island, Bojonegara, Banten, Indonesia, the mangrove litterfall removed to the sea water was 9.023 g/m³/s over two stations (west station, 5.169 g/m³/s; north station, 3.854 g/m³/s). The vegetation was dominated by Rhizophora apiculata and Rhizophora stylosa. The C content is the highest (27.303% and 30.373%), above the N content (0.427% and 0.35%) and the P content (0.19% and 0.143%). The research also aims to determine the diversity of fish inhabiting the mangrove forest. Fish are sampled by push net; the fish caught are collected into plastic bags, and their total length, weight, and individual and total counts are recorded. Meanwhile, three modified pipes (1 m long, 5 inches in diameter, with a closed hole facing the river covered by a nylon cloth) are set in the water channel connecting the mangrove forest and the sea water at each station. They are placed for 1 hour at low tide, after which the speed of the water flow and the volume of the modified pipes are calculated. The fish and mangrove litter will be weighed for wet weight and dry weight, and their C, N, and P contents analyzed. Sampling will be conducted three times per month at full moon. The salinity, temperature, turbidity, pH, DO, and sediment of the mangrove forest will also be measured. This research will give information about the fish diversity in the mangrove forest, the mangrove litterfall removed to the sea water, the composition of the sediment, the total element contents (C, N, P) of fish and mangrove litter, and the correlation of element content absorption between fish and mangrove litter. The data will be used for fish and mangrove ecosystem conservation.Keywords: fish diversity, mangrove forest, mangrove litter, carbon element, nitrogen element, P element, conservation
Procedia PDF Downloads 48526759 Reliability-Based Method for Assessing Liquefaction Potential of Soils
Authors: Mehran Naghizaderokni, Asscar Janalizadechobbasty
Abstract:
This paper explores a probabilistic method for assessing the liquefaction potential of sandy soils. The current simplified methods for assessing soil liquefaction potential use a deterministic safety factor to determine whether liquefaction will occur or not; however, they are unable to determine the liquefaction probability associated with a safety factor. A solution to this problem can be found by reliability analysis. This paper presents a reliability analysis method based on a popular deterministic liquefaction analysis method. The proposed probabilistic method is formulated based on the results of reliability analyses of 190 field records and observations of soil performance against liquefaction. The results of the present study show that a safety factor greater or smaller than 1 does not by itself mean safety or liquefaction, and that a reliability-based analysis should be used to assess the liquefaction probability. This reliability method uses the empirical acceleration attenuation law in the Chalos area to derive the probability density distribution function and the statistics for the earthquake-induced cyclic shear stress ratio (CSR). The CSR and CRR statistics are used in conjunction with the first-order second-moment method to calculate the relation between the liquefaction probability, the safety factor, and the reliability index. Based on the proposed method, the liquefaction probability associated with a safety factor can be easily calculated, and the influence of some of the soil parameters on the liquefaction probability can be quantitatively evaluated.Keywords: liquefaction, reliability analysis, Chalos area, civil and structural engineering
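The relation between safety factor, reliability index, and liquefaction probability can be sketched with a first-order second-moment calculation: β = (μ_CRR − μ_CSR)/√(σ_CRR² + σ_CSR²) and P_liq = Φ(−β). The CRR/CSR statistics below are illustrative, not the Chalos-area values.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Illustrative statistics (assumed, not the study's data) for the cyclic
# resistance ratio (CRR, capacity) and cyclic stress ratio (CSR, demand).
mu_crr, sd_crr = 0.28, 0.05
mu_csr, sd_csr = 0.22, 0.04

beta = (mu_crr - mu_csr) / math.sqrt(sd_crr**2 + sd_csr**2)
p_liq = norm_cdf(-beta)   # liquefaction probability tied to this margin

fs = mu_crr / mu_csr      # mean safety factor
# Note the abstract's point: fs > 1 here, yet p_liq is clearly nonzero,
# so the deterministic safety factor alone does not guarantee safety.
```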
Procedia PDF Downloads 47026758 Computational Fluid Dynamics Analysis of a Biomass Burner Gas Chamber in OpenFOAM
Authors: Óscar Alfonso Gómez Sepúlveda, Julián Ernesto Jaramillo, Diego Camilo Durán
Abstract:
The global climate crisis has affected different aspects of human life, and in an effort to reverse its effects, we seek to optimize and improve the equipment and plants that produce high emissions of CO₂; this is possible through numerical simulation. Such equipment includes biomass combustion chambers. The objective of this research is to visualize the thermal behavior of a gas chamber used in the process of obtaining vegetable extracts. The simulation is carried out with OpenFOAM, taking into account the conservation of energy, turbulence, and radiation; for the purposes of the simulation, combustion is omitted and replaced by heat generation. Within the results, the streamlines generated by the primary and secondary flows are analyzed in order to determine whether they generate the expected effect and whether the energy is used to the maximum. The inclusion of radiation seeks to compare its influence and also to reduce the computational time needed to perform the mesh analysis. An analysis with simplified geometries and experimental data corroborates the selection of the models to be used; for turbulence, the appropriate choice is the standard k-ω model. As a means of verification, a general energy balance is made and compared with the results of the numerical analysis, where the error is 1.67%, which is considered acceptable. Among the improvement options explored, it was found that implementing fins can increase the heat transfer by up to 7.3%.Keywords: CFD analysis, biomass, heat transfer, radiation, OpenFOAM
Procedia PDF Downloads 118