Search results for: stable isotope analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28439

26369 The Moderation Effect of Critical Item on the Strategic Purchasing: Quality Performance Relationship

Authors: Kwong Yeung

Abstract:

Theories about strategic purchasing and quality performance are underdeveloped. Understanding the evolving role of purchasing from reactive to proactive is a pressing strategic issue. Using survey responses from 176 manufacturing and electronics industry professionals, we study the relationships between strategic purchasing and supply chain partners’ quality performance to answer the following questions: Can transaction cost economics be used to elucidate the strategic purchasing-quality performance relationship? Is this strategic purchasing-quality performance relationship moderated by critical item analysis? The findings indicate that critical item analysis positively and significantly moderates the strategic purchasing-quality performance relationship.

Keywords: critical item analysis, moderation, quality performance, strategic purchasing, transaction cost economics

Procedia PDF Downloads 549
26368 Rosuvastatin Improves Endothelial Progenitor Cells in Rheumatoid Arthritis

Authors: Ashit Syngle, Nidhi Garg, Pawan Krishan

Abstract:

Background: Endothelial progenitor cells (EPCs) are depleted in rheumatoid arthritis (RA) and contribute to the increased cardiovascular (CV) risk in this disease. Statins exert a protective effect in CAD partly by promoting EPC mobilization. This vasculoprotective effect of statins has not yet been investigated in RA. We aimed to investigate the effect of rosuvastatin on EPCs in RA. Methods: 50 RA patients were randomized to receive 6 months of treatment with rosuvastatin (10 mg/day, n=25) or placebo (n=25) as an adjunct to existing stable antirheumatic drugs. EPCs (CD34+/CD133+) were quantified by flow cytometry. Inflammatory measures, including DAS28, CRP, and ESR, were measured at baseline and after treatment. Lipids and pro-inflammatory cytokines (TNF-α, IL-6, and IL-1) were estimated at baseline and after treatment. Results: At baseline, inflammatory measures and pro-inflammatory cytokines were elevated and EPCs were depleted in both groups, and EPCs inversely correlated with DAS28 and TNF-α in both groups. EPCs increased significantly (p < 0.01) after treatment with rosuvastatin but did not change significantly with placebo. Rosuvastatin had a positive effect on the lipid spectrum, lowering total cholesterol, LDL, and non-HDL cholesterol and raising HDL compared with placebo. At 6 months, DAS28, ESR, CRP, TNF-α, and IL-6 improved significantly in the rosuvastatin group. A significant negative correlation was observed between EPCs and DAS28, CRP, TNF-α, and IL-6 after treatment with rosuvastatin. Conclusion: This is the first study to show that rosuvastatin improves inflammation and EPC biology in RA, possibly through its anti-inflammatory and lipid-lowering effects. This beneficial effect of rosuvastatin may provide a novel strategy to prevent cardiovascular events in RA.

Keywords: RA, Endothelial Progenitor Cells, rosuvastatin, cytokines

Procedia PDF Downloads 246
26367 Coordinated Voltage Control in a Radial Distribution System

Authors: Shivarudraswamy, Anubhav Shrivastava, Lakshya Bhat

Abstract:

Distributed generation has become a major area of interest in recent years. Distributed generation can serve a large number of loads on a power line and hence offers better efficiency than conventional methods. However, it has certain drawbacks, a rise in bus voltage being the major one. This paper addresses voltage control at the buses of an IEEE 30-bus system by regulating reactive power. For the analysis, suitable locations for placing distributed generators (DGs) are identified through load flow analysis by observing where the voltage profile dips. MATLAB programming is used to regulate the voltage at all buses to within +/-5% of the base value even after the introduction of DGs. Three methods for voltage regulation are discussed. A sensitivity-based analysis is then carried out to determine the priority among the various methods listed in the paper.

Keywords: distributed generators, distributed system, reactive power, voltage control

Procedia PDF Downloads 481
26366 An Analysis of Discourse Markers Awareness in Writing Undergraduate Thesis of English Education Student in Sebelas Maret University

Authors: Oktanika Wahyu Nurjanah, Anggun Fitriana Dewi

Abstract:

An undergraduate thesis is a piece of academic writing that should fulfill several characteristics, one of which is coherence. The coherence of a text depends on the use of discourse markers; in other words, discourse markers play an essential role in writing. Therefore, the researchers aim to examine awareness of discourse marker usage in the writing of an undergraduate thesis by an English Education student at Sebelas Maret University. This research uses a qualitative case study in order to obtain a deep analysis. The sample of this research is an undergraduate thesis by an English Education student at Sebelas Maret University, chosen based on several criteria. Guided by the literature, the researchers grouped the discourse markers according to their functions, and the analysis was conducted on that basis. The analysis found that awareness of discourse marker usage is moderate. Finally, the researchers suggest that undergraduate students familiarize themselves with discourse markers, especially those who intend to write a thesis.

Keywords: discourse markers, English education, thesis writing, undergraduate student

Procedia PDF Downloads 342
26365 Resource Constrained Time-Cost Trade-Off Analysis in Construction Project Planning and Control

Authors: Sangwon Han, Chengquan Jin

Abstract:

Time-cost trade-off (TCTO) analysis is one of the most significant parts of construction project management. Despite its significance, current TCTO analysis, based on the critical path method (CPM), does not consider resource constraints and accordingly sometimes generates an impractical and/or infeasible schedule in terms of resource availability. Therefore, resource constraints need to be considered in TCTO analysis. In this research, a genetic algorithm (GA)-based optimization model is created to find the optimal schedule. This model is used to compare four distinct scenarios (i.e., 1) the initial CPM schedule, 2) TCTO without considering resource constraints, 3) resource allocation after TCTO, and 4) TCTO considering resource constraints) in terms of duration, cost, and resource utilization. The comparison shows that TCTO considering resource constraints generates the optimal schedule with respect to duration, cost, and resources. This verifies the need to consider resource constraints in TCTO analysis. It is expected that the proposed model will produce more feasible and near-optimal schedules.
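
A minimal sketch of a GA of the kind described above, assuming a toy activity network with per-activity execution modes (duration, cost, crew demand), a single renewable resource limit, and penalty-based fitness; all activities, modes, limits, and weights are invented for illustration and are not from the paper:

```python
# Illustrative GA for a resource-constrained time-cost trade-off (not the authors' model).
import random

ACTIVITIES = {               # activity: (predecessors, [(duration, cost, crews), ...])
    "A": ([],        [(4, 100, 2), (3, 140, 3)]),
    "B": (["A"],     [(6, 200, 2), (4, 260, 4)]),
    "C": (["A"],     [(5, 150, 1), (3, 210, 2)]),
    "D": (["B", "C"], [(3,  90, 2), (2, 130, 3)]),
}
RESOURCE_LIMIT = 5           # max crews available at any time (assumed)
ORDER = ["A", "B", "C", "D"]  # topological order

def schedule(chromosome):
    """Earliest-start schedule respecting precedence; returns duration, cost, peak resource use."""
    finish, cost, usage = {}, 0, {}
    for act, mode in zip(ORDER, chromosome):
        preds, modes = ACTIVITIES[act]
        dur, c, crews = modes[mode]
        start = max((finish[p] for p in preds), default=0)
        finish[act] = start + dur
        cost += c
        for t in range(start, start + dur):
            usage[t] = usage.get(t, 0) + crews
    return max(finish.values()), cost, max(usage.values())

def fitness(chromosome, w_time=10, w_cost=1, penalty=1000):
    duration, cost, peak = schedule(chromosome)
    violation = max(0, peak - RESOURCE_LIMIT)      # penalize resource infeasibility
    return w_time * duration + w_cost * cost + penalty * violation  # lower is better

def ga(pop_size=30, generations=100, mutation_rate=0.1):
    n_modes = [len(ACTIVITIES[a][1]) for a in ORDER]
    pop = [[random.randrange(m) for m in n_modes] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            p1, p2 = random.sample(survivors, 2)
            cut = random.randrange(1, len(ORDER))   # one-point crossover
            child = p1[:cut] + p2[cut:]
            if random.random() < mutation_rate:     # random mode mutation
                i = random.randrange(len(ORDER))
                child[i] = random.randrange(n_modes[i])
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = ga()
print("best modes:", best, "-> (duration, cost, peak crews) =", schedule(best))
```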

Keywords: time-cost trade-off, genetic algorithms, critical path, resource availability

Procedia PDF Downloads 167
26364 Sheathed Cotton Fibers: Material for Oil-Spill Cleanup

Authors: Benjamin M Dauda, Esther Ibrahim, Sylvester Gadimoh, Asabe Mustapha, Jiyah Mohammed

Abstract:

Despite diverse optimization techniques applied to natural hydrophilic fibers, hydrophobic synthetic fibers are still the best oil sorption materials. However, these hydrophobic fibers are not biodegradable, making their disposal problematic. To this end, this work sets out to develop nonwoven sorbents from epoxy-coated cotton fibers. To improve compatibility with crude oil and reduce moisture absorption, cotton fibers were coated with epoxy resin by immersion in an acetone-thinned epoxy solution. A needle-punching machine was used to convert the fibers into coherent nonwoven sheets. An oil sorption experiment was then carried out. The results indicate that the developed epoxy-modified sorbent has a higher crude oil sorption capacity than untreated cotton and commercial polypropylene sorbents. Absorption curves show that the coated fiber and polypropylene sorbents saturated faster than the uncoated cotton fiber pad. The results also show that the coated cotton sorbent adsorbed crude oil faster than the polypropylene sorbent, and its equilibrium exhaustion was also higher. After a simple mechanical squeezing process, the nonwoven pads could be restored to their original form and repeatedly recycled for oil/water separation. These results indicate that the coated-cotton nonwoven pads hold promise for the cleanup of oil spills. Our data suggest that the sorption behavior of the epoxy-coated nonwoven pads and their crude oil sorption capacity are relatively stable under various environmental conditions compared with the commercial sheet.

Keywords: oil spill, adsorption, cotton, epoxy, nonwoven

Procedia PDF Downloads 34
26363 Longitudinal Analysis of Internet Speed Data in the Gulf Cooperation Council Region

Authors: Musab Isah

Abstract:

This paper presents a longitudinal analysis of Internet speed data in the Gulf Cooperation Council (GCC) region, focusing on the most populous cities of each of the six countries – Riyadh, Saudi Arabia; Dubai, UAE; Kuwait City, Kuwait; Doha, Qatar; Manama, Bahrain; and Muscat, Oman. The study utilizes data collected from the Measurement Lab (M-Lab) infrastructure over a five-year period from January 1, 2019, to December 31, 2023. The analysis includes downstream and upstream throughput data for the cities, covering significant events such as the launch of 5G networks in 2019, COVID-19-induced lockdowns in 2020 and 2021, and the subsequent recovery period and return to normalcy. The results showcase substantial increases in Internet speeds across the cities, highlighting improvements in both download and upload throughput over the years. All the GCC countries have achieved above-average Internet speeds that can conveniently support various online activities and applications with excellent user experience.
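
A minimal sketch of the kind of longitudinal aggregation described above, assuming the M-Lab NDT results have been exported to a CSV with city, date, download_mbps, and upload_mbps columns (the column names and file name are assumptions, not the paper's pipeline):

```python
# Illustrative yearly throughput summary per city from an assumed CSV export of M-Lab NDT tests.
import pandas as pd

df = pd.read_csv("mlab_ndt_gcc.csv", parse_dates=["date"])
df["year"] = df["date"].dt.year

# Yearly median download/upload throughput per city, 2019-2023
yearly = (
    df.groupby(["city", "year"])[["download_mbps", "upload_mbps"]]
    .median()
    .round(1)
)
print(yearly)

# Year-over-year change (%) in median download speed
change = yearly["download_mbps"].unstack("year").pct_change(axis=1) * 100
print(change.round(1))
```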

Keywords: internet data science, internet performance measurement, throughput analysis, internet speed, measurement lab, network diagnostic tool

Procedia PDF Downloads 35
26362 Integrating Data Envelopment Analysis and Variance Inflation Factor to Measure the Efficiency of Decision Making Units

Authors: Mostafa Kazemi, Zahra N. Farkhani

Abstract:

This paper proposes an integrated data envelopment analysis (DEA) and variance inflation factor (VIF) model for measuring the technical efficiency of decision making units. The model is validated on a data set of 69 dairy products sales representatives. The analysis is done in two stages: in the first stage, the VIF technique is used to identify independent, effective factors of the resellers, and in the second stage, DEA is used to measure efficiency under both constant and variable returns to scale. DEA is further used to examine the effect of environmental factors on efficiency. The results indicate an average managerial efficiency of 83% across the dairy products sales representatives. In addition, technical and scale efficiency were 96% and 80%, respectively. 38% of the sales representatives have a technical efficiency of 100%, and 72% of the sales representatives are quite efficient in terms of managerial efficiency. High levels of relative efficiency indicate good overall performance among the sales representatives.
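
A minimal sketch of the first-stage VIF screening described above (illustrative only; the synthetic factors and the common cutoff of 10 are assumptions, not values from the paper):

```python
# VIF_i = 1 / (1 - R_i^2), where R_i^2 comes from regressing column i on the other columns.
import numpy as np

def vif(X):
    """Variance inflation factor for each column of X."""
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    out = []
    for i in range(k):
        y = X[:, i]
        others = np.delete(X, i, axis=1)
        A = np.column_stack([np.ones(n), others])        # add intercept
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1 - resid.var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return out

# Example: three candidate input factors for the second-stage DEA model
rng = np.random.default_rng(0)
x1 = rng.normal(size=50)
x2 = 0.8 * x1 + rng.normal(scale=0.3, size=50)   # deliberately collinear with x1
x3 = rng.normal(size=50)
factors = np.column_stack([x1, x2, x3])

vifs = vif(factors)
keep = [i for i, v in enumerate(vifs) if v < 10.0]   # drop highly collinear factors
print([round(v, 2) for v in vifs], "-> keep columns", keep)
```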

Keywords: data envelopment analysis (DEA), relative efficiency, sales representatives’ dairy products, variance inflation factor (VIF)

Procedia PDF Downloads 542
26361 Novel Emulgel of Piroxicam for Topical Application with Mentha and Clove Oil

Authors: S. V. Patil, P. S. Dounde, S. S. Patil

Abstract:

Emulgels have emerged as one of the most interesting topical delivery systems because they combine a dual release-control system: gel and emulsion. The major objective behind this formulation is the delivery of hydrophobic drugs to the systemic circulation via the skin; the presence of a gelling agent in the water phase converts a classical emulsion into an emulgel. Emulgels for dermatological use have several favorable properties: they are thixotropic, greaseless, easily spreadable, easily removable, emollient, non-staining, water-soluble, bio-friendly, and transparent, with a longer shelf life and a pleasing appearance. Various penetration enhancers can potentiate the effect, so emulgels can serve as better topical drug delivery systems than the conventional systems currently available on the market. Piroxicam is a non-steroidal anti-inflammatory drug that has major problems when administered orally: it is poorly soluble and irritates the gastrointestinal tract, leading to ulceration and bleeding. The aim of this study was to overcome these problems by preparing a topical emulgel of this drug. The piroxicam emulgel was prepared using Carbopol 940 along with mentha oil and clove oil as permeation enhancers. The prepared emulgels were evaluated for physical appearance, pH, viscosity, spreadability, in vitro drug release, and ex vivo permeation. All the prepared formulations showed acceptable physical properties, homogeneity, consistency, spreadability, viscosity, and pH values. The emulgel was found to be stable with respect to physical appearance, pH, rheological properties, and drug content at all tested temperatures and conditions for three months.

Keywords: emulgel, piroxicam, mentha oil, clove oil

Procedia PDF Downloads 444
26360 Effects of Small Amount of Poly(D-Lactic Acid) on the Properties of Poly(L-Lactic Acid)/Microcrystalline Cellulose/Poly(D-Lactic Acid) Blends

Authors: Md. Hafezur Rahaman, Md. Sagor Hosen, Md. Abdul Gafur, Rasel Habib

Abstract:

This research is a systematic study of the effects of poly(D-lactic acid) (PDLA) on the properties of poly(L-lactic acid) (PLLA)/microcrystalline cellulose (MCC)/PDLA blends through stereocomplex crystallization. Blends were prepared with a constant 3 percent MCC and different percentages of PDLA by solution casting. The blends were characterized by Fourier transform infrared spectroscopy (FTIR) to confirm blend compatibility, wide-angle X-ray scattering (WAXS) and scanning electron microscopy (SEM) for morphology, and thermogravimetric analysis (TGA) and differential thermal analysis (DTA) for thermal properties. The FTIR results confirm that no new characteristic absorption peaks appeared in the spectrum; instead, peak shifts due to hydrogen bonding indicate compatibility of the blend components. The development of three new peaks in the XRD analysis strongly indicates the formation of stereocomplex crystallites in the PLLA structure with the addition of PDLA. TGA and DTG results indicate that PDLA can improve the heat resistance of the PLLA/MCC blends by increasing their degradation temperature. A comparison of the DTA peaks also confirms the improved thermal properties. SEM images show improved surface morphology.

Keywords: microcrystalline cellulose, poly(l-lactic acid), stereocomplex crystallization, thermal stability

Procedia PDF Downloads 121
26359 Identify the Renewable Energy Potential through Sustainability Indicators and Multicriteria Analysis

Authors: Camila Lima, Murilo Andrade Valle, Patrícia Teixeira Leite Asano

Abstract:

The growth in electricity demand driven by human development, together with the depletion and environmental impacts associated with traditional sources of electricity generation, has made new energy sources increasingly encouraged and necessary for companies in the electricity sector. Based on this scenario, this paper assesses the negative environmental impacts associated with thermoelectric power plants in Brazil, pointing out the importance of using renewable energy sources to reduce environmental harm. The article identifies wind energy as an alternative for the municipalities of São Paulo, represented in georeferenced maps with the help of GIS, using sustainability indicators and multicriteria analysis as the premise of the decision-making process.

Keywords: GIS (geographic information systems), multicriteria analysis, sustainability, wind energy

Procedia PDF Downloads 347
26358 The Non-Linear Analysis of Brain Response to Visual Stimuli

Authors: H. Namazi, H. T. N. Kuan

Abstract:

Brain activity can be measured by acquiring and analyzing EEG signals from an individual; in fact, the human brain's response to external and internal stimuli is mapped in the EEG signals. Over the years, methods such as the Fourier transform, wavelet transform, and empirical mode decomposition have been used to analyze EEG signals in order to find the effect of stimuli, especially external stimuli. However, each of these methods has weak points in the analysis of EEG signals. For instance, the Fourier transform and wavelet transform are linear signal analysis methods, which makes them poorly suited to EEG signals, which are nonlinear. In this research, we analyze the brain response to visual stimuli by extracting information in the form of various measures from EEG signals using software developed by our research group. The measures used are Jeffrey's measure, the fractal dimension, and the Hurst exponent. The results of these analyses are useful not only for a fundamental understanding of the brain response to visual stimuli but also provide very good recommendations for clinical purposes.
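
A minimal sketch of one of the nonlinear measures named above, the Hurst exponent, estimated by classical rescaled-range (R/S) analysis; the synthetic signal and window sizes are assumptions, and this is not the group's own software:

```python
# Hurst exponent by R/S analysis: H is the slope of log(R/S) versus log(window size).
import numpy as np

def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
    x = np.asarray(x, dtype=float)
    rs_means = []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            seg = x[start:start + n]
            z = np.cumsum(seg - seg.mean())      # mean-adjusted cumulative deviations
            r = z.max() - z.min()                # range
            s = seg.std()                        # standard deviation
            if s > 0:
                rs_vals.append(r / s)
        rs_means.append(np.mean(rs_vals))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(rs_means), 1)
    return slope

# Example on a synthetic "EEG-like" signal (white noise should give H close to 0.5)
rng = np.random.default_rng(1)
signal = rng.normal(size=2048)
print("Hurst exponent:", round(hurst_rs(signal), 2))
```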

Keywords: visual stimuli, brain response, EEG signal, fractal dimension, hurst exponent, Jeffrey’s measure

Procedia PDF Downloads 541
26357 Transdermal Therapeutic System of Lercanıdipine Hydrochloride: Fabrication and in Vivo Evaluation

Authors: Jiji Jose, R. Narayanacharyulu, Molly Mathew, Jisha Prems

Abstract:

Introduction: Lercanidipine hydrochloride (LD), an effective calcium channel blocker widely used for the treatment of chronic stable angina and hypertension, seems to be a potential transdermal therapeutic system candidate, mainly due to its low oral bioavailability, short half-life, and high first-pass metabolism. Objective: To develop transdermal therapeutic systems for LD and to evaluate their in vivo performance in rabbits. Methodology: Transdermal patches of LD were formulated using a polymer blend of Eudragit RL100 (ERL) and polyvinylpyrrolidone (PVP) by the casting method. Propylene glycol (PG) and Tween 80 were used as plasticizer and permeation enhancer, respectively. The pharmacokinetic parameters of LD after administration of the transdermal patches were compared with those after oral administration. The study was carried out in a two-way crossover design in male New Zealand albino rabbits. Results: The formulation with an ERL:PVP ratio of 1:4, 15% w/w PG as plasticizer, and 4% w/w Tween 80 as permeation enhancer showed the best drug release. The pharmacokinetic parameters such as Cmax, tmax, mean residence time (MRT), and area under the curve (AUC 0-∞) were significantly different following transdermal administration compared with oral administration. The terminal half-life of transdermally administered LD was found to be similar to that of oral administration. A sustained drug release over a period of 24 hours was observed after transdermal administration. Conclusion: The fabricated transdermal delivery system has the potential to provide controlled and extended drug release and better bioavailability, and thus may improve patient compliance.

Keywords: transdermal therapeutic system, lercanidipine hydrochloride, eudragit, skin permeation

Procedia PDF Downloads 602
26356 PM Air Quality of Windsor Regional Scale Transport’s Impact and Climate Change

Authors: Moustafa Osman Mohammed

Abstract:

This paper maps an air quality model onto the engineering of industrial systems that are ultimately utilized across an extensive range of energy systems, distribution resources, and end-user technologies. The model determines the contribution of long-range transport patterns as an area source, which can either be traced with a 48-hour backward trajectory model or described remotely from background measurement data for those days. The trajectory model is run under stable conditions and fairly constant atmospheric pressure for most of the year. Air parcel trajectories are necessary for estimating the long-range transport of pollutants and other chemical species and provide a better understanding of airflow patterns. Since a large amount of meteorological data and a great number of calculations are required to derive trajectories, it is very useful to apply the HYSPLIT model to locate the areas and boundaries that influence air quality at the regional scale around Windsor. Two-day backward trajectories are modelled for high and low concentration measurements below and above the benchmark, identifying the areas that influence measured air quality levels. The benchmark is taken as 30 μg/m³, the moderate level for the Ontario region. The air quality model thereby incorporates a midpoint concept between biotic and abiotic components to broaden the scope of impact quantification. In light of theories of environmental obligation, the outcomes ultimately suggest recommendations or decisions on what legislation should achieve through mitigation measures for air emission impacts.

Keywords: air quality, management systems, environmental impact assessment, industrial ecology, climate change

Procedia PDF Downloads 228
26355 Static and Dynamical Analysis on Clutch Discs on Different Material and Geometries

Authors: Jairo Aparecido Martins, Estaner Claro Romão

Abstract:

This paper presents the static and cyclic stresses, in combination with a fatigue analysis, resulting from loads applied to the friction discs typically used in industrial clutches. The material chosen to simulate the friction discs under load is aluminum. The numerical simulation was performed with COMSOL™ Multiphysics. The results obtained for static loads showed sufficient stiffness for both geometries with the material used. From the fatigue standpoint, on the other hand, failure is clearly verified, which demonstrates the importance of both approaches, especially the dynamic analysis. The results and the conclusions are based on the stresses on the disc, the counted stress cycles, and the fatigue usage factor.

Keywords: aluminum, industrial clutch, static and dynamic loading, numerical simulation

Procedia PDF Downloads 175
26354 Revealing of the Wave-Like Process in Kinetics of the Structural Steel Radiation Degradation

Authors: E. A. Krasikov

Abstract:

The dependence of material properties on neutron irradiation intensity (flux) is a key problem when using data from accelerated materials irradiation in test reactors to forecast their capacity for work under realistic (practical) operating conditions. Investigations of the dependence of reactor pressure vessel steel radiation degradation on fast neutron fluence (embrittlement kinetics) at low flux reveal instability in the form of scatter in the experimental data and the appearance of wave-like sections in the embrittlement kinetics. The oscillating character of the steel degradation is a sign of a cyclic self-recovery transformation of the steel structure, as takes place in self-organization processes. This assumption has received support through the discovery of similar 'anomalous' data in scientific publications and through our own additional experiments. The data obtained stimulate the search for ways to manage the radiation stability of structural steel (for example, by means of nanostructure modification to intensify the annihilation of radiation defects) in order to create intelligent self-recovering materials. Expected results include: development of radiation degradation theory and mechanisms; more adequate models of radiation embrittlement; improvement of surveillance specimen programs; development of methods and facilities for using accelerated-irradiation data to forecast capacity for work under realistic (practical) operating conditions; and a search for ways to create radiation-stable, self-recovering intelligent materials.

Keywords: degradation, radiation, steel, wave-like kinetics

Procedia PDF Downloads 292
26353 Study of Mobile Game Addiction Using Electroencephalography Data Analysis

Authors: Arsalan Ansari, Muhammad Dawood Idrees, Maria Hafeez

Abstract:

The use of mobile phones has increased considerably over the past decade; currently, it is one of the main sources of communication and information. Initially, mobile phones were limited to calls and messages, but with the advent of new technology, smartphones are used for many other purposes, including video games. Despite the positive outcomes, addiction to video games on mobile phones has become a leading cause of psychological and physiological problems for many people. Several researchers have examined different aspects of behavioral addiction using different scales. The objective of this study is to examine whether mobile-game-addicted and non-addicted players can be distinguished using electroencephalography (EEG), based upon psychophysiological indicators. The players were asked to play a mobile game, and EEG signals were recorded with BIOPAC equipment using AcqKnowledge as the data acquisition software. Electrodes were placed following the 10-20 system. EEG was recorded at a sampling rate of 200 samples/s (12,000 samples/min). EEG recordings were obtained from the frontal (Fp1, Fp2), parietal (P3, P4), and occipital (O1, O2) lobes of the brain. The frontal lobe is associated with behavioral control, personality, and emotions; the parietal lobe is involved in perception, understanding logic, and arithmetic; and the occipital lobe plays a role in visual tasks. For this study, a 60-second time window was chosen for analysis. Preliminary analysis of the signals was carried out with the AcqKnowledge software from BIOPAC Systems. From the survey based on the 2010 CGS manual study, it was concluded that five of the fifteen participants were in the addicted category; this was used as prior information to group the addicted and non-addicted players by physiological analysis. Statistical analysis showed that, by applying a clustering technique, the authors were able to separate addicted from non-addicted players, specifically on the theta frequency range of the occipital area.
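
A minimal sketch of the kind of analysis described above: theta-band (4-8 Hz) power from 60-second occipital segments followed by two-cluster k-means; the synthetic data, band limits, and clustering setup are assumptions, not the study's pipeline:

```python
# Theta-band power per participant (Welch PSD) followed by 2-cluster k-means.
import numpy as np
from scipy.signal import welch
from sklearn.cluster import KMeans

FS = 200  # sampling rate used in the study (samples/s)

def theta_power(eeg_segment, fs=FS, band=(4.0, 8.0)):
    """Mean power spectral density in the theta band for one channel segment."""
    freqs, psd = welch(eeg_segment, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

# Fifteen participants, one 60-s occipital segment each (synthetic stand-in data)
rng = np.random.default_rng(2)
segments = rng.normal(size=(15, 60 * FS))

features = np.array([[theta_power(seg)] for seg in segments])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print("Cluster assignments:", labels)
```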

Keywords: mobile game, addiction, psycho-physiology, EEG analysis

Procedia PDF Downloads 149
26352 Corrosion Behvaior of CS1018 in Various CO2 Capture Solvents

Authors: Aida Rafat, Ramazan Kahraman, Mert Atilhan

Abstract:

The aggressive corrosion behavior of conventional amine solvents is one of the main barriers against large-scale commercialization of the amine absorption process for carbon capture applications. Novel CO2 absorbents that exhibit minimal corrosivity under operating conditions are essential to lower corrosion damage and control requirements and to ensure a more robust capture plant. This work investigated the corrosion behavior of carbon steel CS1018 in various CO2 absorbent solvents. The tested solvents included the classical amines MEA, DEA, and MDEA; the piperazine-activated solvents MEA/PZ, MDEA/PZ, and MEA/MDEA/PZ; as well as mixtures of MEA and room-temperature ionic liquids (RTILs), namely MEA/[C4MIM][BF4] and MEA/[C4MIM][OTf]. The electrochemical polarization technique was used to determine the system's corrosiveness in terms of corrosion rate and polarization behavior. The process parameters of interest were CO2 loading and solution temperature. The electrochemical results showed that the corrosivity order of the classical amines at 40 °C is MDEA > MEA > DEA, whereas at 80 °C the ranking changes to MEA > DEA > MDEA. The corrosivity rankings were mainly governed by CO2 absorption capacity at the test temperature. The corrosivity ranking for the activated amines at 80 °C was MEA/PZ > MDEA/PZ > MEA/MDEA/PZ. Piperazine addition appeared to have a dual advantage, enhancing CO2 absorption capacity as well as suppressing corrosion. For the MEA/RTIL mixtures, the preliminary results showed that the partial replacement of the aqueous phase in the MEA solution by the more stable, nonvolatile RTIL solvents reduced corrosion rates considerably.

Keywords: corrosion, amines, CO2 capture, piperazine, ionic liquids

Procedia PDF Downloads 446
26351 The Relationship between Sleep Traits and Tinnitus in UK Biobank: A Population-Based Cohort Study

Authors: Jiajia Peng, Yijun Dong, Jianjun Ren, Yu Zhao

Abstract:

Objectives: Understanding the association between sleep traits and tinnitus could help prevent and provide appropriate interventions against tinnitus. Therefore, this study aimed to assess the relationship between different sleep patterns and tinnitus. Design: A cross-sectional analysis of baseline data (2006–2010, n=168,064) using logistic regressions was conducted to evaluate the association between sleep traits (including the overall healthy sleep score and five sleep behaviors) and the occurrence (yes/no), frequency (constant/transient), and severity (upsetting/not upsetting) of tinnitus. Further, a prospective analysis was performed of participants without tinnitus at baseline (n=9,581) who were followed up for seven years (2012–2019), to assess the association between new-onset tinnitus and sleep characteristics. Moreover, a subgroup analysis was also carried out to estimate sex differences by dividing the participants into male and female groups. A sensitivity analysis was also conducted by excluding ear-related diseases to avoid their confounding effects on tinnitus (n=102,159). Results: In the cross-sectional analysis, participants with “current tinnitus” (OR: 1.13, 95% CI: 1.04–1.22, p=0.004) had a higher risk of having a poor overall healthy sleep score and unhealthy sleep behaviors such as short sleep durations (OR: 1.09, 95% CI: 1.04–1.14, p<0.001), late chronotypes (OR: 1.09, 95% CI: 1.05–1.13, p<0.001), and sleeplessness (OR: 1.16, 95% CI: 1.11–1.22, p<0.001) than those participants who “did not have current tinnitus.” However, this trend was not obvious between “constant tinnitus” and “transient tinnitus.” When considering the severity of tinnitus, the risk of “upsetting tinnitus” was obviously higher if participants had lower overall healthy sleep scores (OR: 1.31, 95% CI: 1.13–1.53, p<0.001). Additionally, short sleep duration (OR: 1.22, 95% CI: 1.12–1.33, p<0.001), late chronotypes (OR: 1.13, 95% CI: 1.04–1.22, p=0.003), and sleeplessness (OR: 1.43, 95% CI: 1.29–1.59, p<0.001) showed positive correlations with “upsetting tinnitus.” In the prospective analysis, sleeplessness presented a consistently significant association with “upsetting tinnitus” (RR: 2.28, P=0.001). Consistent results were observed in the sex subgroup analysis, where a much more pronounced trend was identified in females compared with males. The results of the sensitivity analysis were consistent with those of the cross-sectional and prospective analyses. Conclusions: Different types of sleep disturbance may be associated with the occurrence and severity of tinnitus; therefore, precise interventions for different types of sleep disturbance, particularly sleeplessness, may help in the prevention and treatment of tinnitus.
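
A minimal sketch of the cross-sectional logistic-regression setup described above, producing odds ratios with 95% confidence intervals; the synthetic data frame and variable names are illustrative assumptions, not UK Biobank fields:

```python
# Logistic regression of tinnitus occurrence on sleep traits, reported as odds ratios.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 5000
df = pd.DataFrame({
    "tinnitus": rng.integers(0, 2, n),        # 1 = current tinnitus
    "sleep_score": rng.integers(0, 6, n),     # overall healthy sleep score (0-5)
    "short_sleep": rng.integers(0, 2, n),     # short sleep duration indicator
    "sleeplessness": rng.integers(0, 2, n),
    "age": rng.normal(57, 8, n),
    "sex": rng.integers(0, 2, n),
})

model = smf.logit(
    "tinnitus ~ sleep_score + short_sleep + sleeplessness + age + sex", data=df
).fit(disp=False)

# Odds ratios with 95% confidence intervals
or_table = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI_low": np.exp(model.conf_int()[0]),
    "CI_high": np.exp(model.conf_int()[1]),
})
print(or_table.round(2))
```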

Keywords: tinnitus, sleep, sleep behaviors, sleep disturbance

Procedia PDF Downloads 119
26350 Risks beyond Cyber in IoT Infrastructure and Services

Authors: Mattias Bergstrom

Abstract:

Significance of the study: This research will provide new insights into the risks of digital embedded infrastructure. We analyze each risk and its potential negation strategies, especially for AI and autonomous automation. Moreover, the analysis presented in this paper conveys valuable information for future research aimed at creating more stable, secure, and efficient autonomous systems. To learn and understand the risks, a large IoT system was envisioned, and risks related to hardware, tampering, and cyberattacks were collected, researched, and evaluated to create a comprehensive understanding of the potential risks. Potential solutions were then evaluated on an open-source IoT hardware setup. The following list shows the identified passive and active risks evaluated in the research. Passive risks: (1) Hardware failures - critical systems relying on high-rate, high-quality data are growing; SCADA systems for infrastructure are good examples of such systems. (2) Hardware delivering erroneous data - sensors break, and when they do, they don't always go silent; they can keep going, but the data they deliver is garbage, and if that data is not filtered out, it becomes disruptive noise in the system. (3) Bad hardware injection - erroneously generated sensor data can be pumped into a system by malicious actors with the intent to create disruptive noise in critical systems. (4) Data gravity - the weight of the data collected affects data mobility. (5) Cost inhibitors - running services that need huge centralized computing is cost-inhibiting; large, complex AI can be extremely expensive to run. Active risks: Denial of service - one of the simplest attacks, where an attacker overloads the system with bogus requests so that valid requests disappear in the noise. Malware - malware can be anything from simple viruses to complex botnets created with specific goals, where the creator steals computing power and bandwidth from you to attack someone else. Ransomware - a kind of malware, but so different in its implementation that it is worth its own mention; the goal of these pieces of software is to encrypt your system so that it can only be unlocked with a key that is held for ransom. DNS spoofing - by spoofing DNS calls, valid requests and data dumps can be sent to bad destinations, where the data can be extracted for extortion or corrupted and re-injected into a running system, creating a data echo noise loop. After testing multiple potential solutions, we found that the most prominent solution to these risks was to use a peer-to-peer consensus algorithm over a blockchain to validate the data and behavior of the devices (sensors, storage, and computing) in the system. By having the devices autonomously police themselves for deviant behavior, all the risks listed above can be negated. In conclusion, an Internet middleware that provides these features would be an easy and secure solution for any future autonomous IoT deployment, as it provides separation from the open Internet while remaining accessible via blockchain keys.

Keywords: IoT, security, infrastructure, SCADA, blockchain, AI

Procedia PDF Downloads 92
26349 Post Pandemic Mobility Analysis through Indexing and Sharding in MongoDB: Performance Optimization and Insights

Authors: Karan Vishavjit, Aakash Lakra, Shafaq Khan

Abstract:

The COVID-19 pandemic has pushed healthcare professionals to use big data analytics as a vital tool for tracking and evaluating the effects of contagious viruses. To effectively analyze huge datasets, efficient NoSQL databases are needed. The analysis of post-COVID-19 health and well-being outcomes and the evaluation of the effectiveness of government efforts during the pandemic are made possible by this research's integration of several datasets, which cuts down on query processing time and creates predictive visual artifacts. We recommend applying sharding and indexing technologies to improve query effectiveness and scalability as the dataset expands. Effective data retrieval and analysis are made possible by spreading the datasets across a sharded database and indexing the individual shards. Analysis of the connections between governmental activities, poverty levels, and post-pandemic well-being is the key goal. We evaluate the effectiveness of governmental initiatives to improve health and lower poverty levels by utilizing advanced data analysis and visualizations. The findings provide relevant data that supports the advancement of the UN sustainable development objectives, future pandemic preparation, and evidence-based decision-making. This study shows how big data and NoSQL databases may be used to address problems in global health.
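
A minimal sketch of the indexing and sharding setup described above, using PyMongo; the connection string, database and collection names, field names, and shard key are assumptions rather than the paper's actual schema, and the sharding commands require a sharded cluster reached through a mongos router:

```python
# Index frequently queried fields and shard the collection so queries scale with the dataset.
from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://localhost:27017")   # assumed mongos/router address
db = client["pandemic_analytics"]
collection = db["wellbeing_records"]

# Compound and single-field indexes to cut query processing time
collection.create_index([("country", ASCENDING), ("date", ASCENDING)])
collection.create_index([("poverty_level", ASCENDING)])

# Enable sharding on the database and shard the collection on a hashed key
client.admin.command("enableSharding", "pandemic_analytics")
client.admin.command(
    "shardCollection",
    "pandemic_analytics.wellbeing_records",
    key={"country": "hashed"},
)

# Example query that benefits from the compound index
for doc in collection.find({"country": "Canada"}).sort("date", ASCENDING).limit(5):
    print(doc)
```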

Keywords: big data, COVID-19, health, indexing, NoSQL, sharding, scalability, well being

Procedia PDF Downloads 53
26348 Ergonomical Study of Hand-Arm Vibrational Exposure in a Gear Manufacturing Plant in India

Authors: Santosh Kumar, M. Muralidhar

Abstract:

The term ‘ergonomics’ is derived from two Greek words: ‘ergon’, meaning work, and ‘nomoi’, meaning natural laws. Ergonomics is the study of how working conditions, machines, and equipment can be arranged so that people can work with them more efficiently. In this research communication, an attempt has been made to study the effect of hand-arm vibration exposure on the workers of a gear manufacturing plant by comparing potential carpal tunnel syndrome (CTS) symptoms and the effect of different vibration exposure levels on the occurrence of CTS in an actual industrial environment. A chi-square test and correlation analysis were used for the statistical analysis. The chi-square test shows that the occurrence of potential CTS symptoms depends significantly on the level of vibration exposure. The data analysis indicates that 40.51% of the workers with potential CTS symptoms are exposed to vibration. The correlation analysis reveals that potential CTS symptoms are significantly correlated with the level of vibration exposure from handheld tools and with repetitive wrist movements.
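
A minimal sketch of the chi-square test of independence described above; the 2x2 contingency counts are invented for illustration and are not the study's data:

```python
# Chi-square test: does CTS symptom occurrence depend on vibration exposure?
from scipy.stats import chi2_contingency

# Rows: exposed / not exposed to hand-arm vibration
# Columns: potential CTS symptoms present / absent
table = [[32, 47],
         [11, 68]]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
if p_value < 0.05:
    print("CTS symptom occurrence depends significantly on vibration exposure.")
```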

Keywords: CTS symptoms, hand-arm vibration, ergonomics, physical tests

Procedia PDF Downloads 359
26347 Holomorphic Prioritization of Sets within Decagram of Strategic Decision Making of POSM Using Operational Research (OR): Analytic Hierarchy Process (AHP) Analysis

Authors: Elias Ogutu Azariah Tembe, Hussain Abdullah Habib Al-Salamin

Abstract:

There is a decagram of strategic decisions of operations and production/service management (POSM) within operational research (OR) which must collate, namely: design, inventory, quality, location, process and capacity, layout, scheduling, maintenance, and supply chain. This paper presents a conceptual architectural-configuration framework of the decagram of decision sets in the form of a mathematical complete graph and an abelian graph. Mathematically, a complete graph is an undirected (UDG) or directed (DG) graph in which every pair of vertices is connected, collated, confluent, and holomorphic. However, no study has yet been conducted that prioritizes the holomorphic sets of POSM within the OR field of study. The study utilizes the structured OR technique known as the Analytic Hierarchy Process (AHP) for organizing, sorting, and prioritizing (ranking) the sets within the decagram of POSM according to their attribution (propensity), and provides an analysis of how the prioritization has real-world application in the 21st century.
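
A minimal sketch of AHP prioritization via the principal eigenvector, the core calculation behind the ranking described above; the 4x4 pairwise-comparison matrix over four of the decision sets is invented for illustration, not taken from the paper:

```python
# AHP: priority weights from a pairwise-comparison matrix plus a consistency check.
import numpy as np

criteria = ["quality", "process and capacity", "supply chain", "scheduling"]

# Saaty 1-9 scale comparisons (A[i, j] = importance of criterion i relative to j)
A = np.array([
    [1.0, 3.0, 2.0, 5.0],
    [1/3, 1.0, 1/2, 3.0],
    [1/2, 2.0, 1.0, 4.0],
    [1/5, 1/3, 1/4, 1.0],
])

# Priority vector = principal eigenvector, normalized to sum to 1
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency: CI = (lambda_max - n) / (n - 1); CR = CI / RI (Saaty's RI ~ 0.90 for n = 4)
n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)
cr = ci / 0.90

for name, w in sorted(zip(criteria, weights), key=lambda t: -t[1]):
    print(f"{name:22s} {w:.3f}")
print(f"consistency ratio = {cr:.3f} (acceptable if < 0.10)")
```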

Keywords: holomorphic, decagram, decagon, confluent, complete graph, AHP analysis, SCM, HRM, OR, OM, abelian graph

Procedia PDF Downloads 390
26346 The Use of Random Set Method in Reliability Analysis of Deep Excavations

Authors: Arefeh Arabaninezhad, Ali Fakher

Abstract:

Since deterministic analysis methods fail to take system uncertainties into account, probabilistic and non-probabilistic methods have been suggested. Geotechnical analyses are used to determine the stress and deformation caused by construction; accordingly, many input variables that depend on ground behavior are required. The random set approach is an applicable reliability analysis method when comprehensive sources of information are not available. Using the random set method, with a relatively small number of simulations compared to fully probabilistic methods, smooth extremes of the system responses are obtained; therefore, the random set approach has been proposed for reliability analysis in geotechnical problems. In the present study, the application of the random set method to the reliability analysis of deep excavations is investigated through three deep excavation projects that were monitored during the excavation process. A finite element code is utilized for the numerical modeling. Two expected ranges, from different sources of information, are established for each input variable, and a specific probability assignment is defined for each range. To determine the most influential input variables and subsequently reduce the number of required finite element calculations, a sensitivity analysis is carried out. Input data for the finite element model are obtained by combining the upper and lower bounds of the input variables. The relevant probability share of each finite element calculation is determined by considering the probability assigned to the input variables present in these combinations. The horizontal displacement of the top point of the excavation is considered the main response of the system. The result of the reliability analysis for each deep excavation is presented by constructing the belief and plausibility distribution functions (i.e., lower and upper bounds) of the system response obtained from the deterministic finite element calculations. To evaluate the quality of the input variables as well as the applied reliability analysis method, the range of displacements extracted from the models was compared with the in situ measurements, and good agreement was observed. The comparison also showed that the random set finite element method is applicable for estimating the horizontal displacement of the top point of a deep excavation. Finally, the probability of failure or unsatisfactory performance of the system is evaluated by comparing the threshold displacement with the reliability analysis results.
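
A minimal sketch of the random-set bookkeeping described above: two input variables with interval focal elements and basic probability assignments are combined, each combination stands in for one deterministic finite element run, and belief and plausibility of exceeding a displacement threshold are accumulated; all numbers and the toy response function are invented for illustration:

```python
# Random-set combination of interval inputs with belief/plausibility of a threshold exceedance.
from itertools import product

# Each input variable: list of (interval, probability assignment); assignments sum to 1
friction_angle = [((28.0, 32.0), 0.6), ((30.0, 36.0), 0.4)]   # degrees (assumed)
cohesion       = [((10.0, 20.0), 0.5), ((15.0, 30.0), 0.5)]   # kPa (assumed)

def fe_displacement(phi, c):
    """Stand-in for a deterministic FE run: horizontal top displacement (mm)."""
    return 80.0 - 1.2 * phi - 0.8 * c   # toy response surface, not a real FE model

THRESHOLD = 25.0  # mm, assumed allowable displacement

belief = plausibility = 0.0
for (phi_int, p_phi), (c_int, p_c) in product(friction_angle, cohesion):
    weight = p_phi * p_c   # probability share of this input combination
    # Response bounds over the corner combinations of the two intervals
    corners = [fe_displacement(phi, c) for phi in phi_int for c in c_int]
    lo, hi = min(corners), max(corners)
    if lo > THRESHOLD:        # the whole focal element exceeds the threshold
        belief += weight
    if hi > THRESHOLD:        # the focal element can exceed the threshold
        plausibility += weight

print(f"P(displacement > {THRESHOLD} mm): belief = {belief:.2f}, plausibility = {plausibility:.2f}")
```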

Keywords: deep excavation, random set finite element method, reliability analysis, uncertainty

Procedia PDF Downloads 254
26345 Deformation Analysis of Pneumatized Sphenoid Bone Caused Due to Elevated Intracranial Pressure Using Finite Element Analysis

Authors: Dilesh Mogre, Jitendra Toravi, Saurabh Joshi, Prutha Deshpande, Aishwarya Kura

Abstract:

In the earlier days of technology, it was not possible to understand the nature of complex biomedical problems, which were left to clinical postulation. With the advancement of science, we now have tools like finite element modelling and simulation to solve complex biomedical problems. This paper presents how ANSYS Workbench can be used to study the deformation of a pneumatized sphenoid bone caused by increased intracranial pressure. Intracranial pressure refers to the pressure inside the skull. An increase in pressure above the normal level of 15 mmHg can lead to serious conditions due to the developed stresses and deformation. One of the areas where the deformation is suspected to occur is the sphenoid bone. Moreover, the varying degree of pneumatization increases the complexity of the conditions. It is therefore necessary to study the deformation patterns of a pneumatized sphenoid bone model at elevated intracranial pressure. Finite element analysis plays a major role in developing and analyzing the model and gives quantitative results.

Keywords: intracranial pressure, pneumatized sphenoid bone, deformation, finite element analysis

Procedia PDF Downloads 176
26344 Elastic Stress Analysis of Annular Bi-Material Discs with Variable Thickness under Mechanical and Thermomechanical Loads

Authors: Erhan Çetin, Ali Kurşun, Şafak Aksoy, Merve Tunay Çetin

Abstract:

This closed-form study deals with the elastic stress analysis of annular bi-material discs with variable thickness subjected to mechanical and thermomechanical loads. Such discs have many applications in the aerospace industry, such as gas turbines and gears, and normally work under thermal and mechanical loads. Their life cycle can increase when the stress components are minimized. Each material is assumed to be isotropic. The results show that material combinations and thickness profiles play an important role in determining the responses of bi-material discs and in the optimal design of those structures. The stress distribution is investigated, and the results are shown as graphs.

Keywords: bi-material discs, elastic stress analysis, mechanical loads, rotating discs

Procedia PDF Downloads 315
26343 Carbon Dioxide Hydrogenation to Methanol over Cu/ZnO-SBA-15 Catalyst: Effect of Metal Loading

Authors: S. F. H. Tasfy, N. A. M. Zabidi, M.-S. Shaharun

Abstract:

Utilization of CO2 as a carbon source to produce valuable chemicals is one of the important ways to reduce the global warming caused by increasing CO2 in the atmosphere. Supported metal catalysts are crucial for the production of clean and renewable fuels and chemicals from the stable CO2 molecule. The catalytic conversion of CO2 into methanol has recently come under increased scrutiny as an opportunity to use CO2 as a low-cost carbon source. Therefore, a series of bimetallic Cu/ZnO-based catalysts supported on SBA-15 were synthesized via the impregnation technique with different total metal loadings and tested in the catalytic hydrogenation of CO2 to methanol. The morphological and textural properties of the synthesized catalysts were determined by transmission electron microscopy (TEM); temperature-programmed desorption, reduction, oxidation, and pulse chemisorption (TPDRO); and N2 adsorption. The CO2 hydrogenation reaction was performed in a microactivity fixed-bed system at 250 °C, 2.25 MPa, and an H2/CO2 ratio of 3. The experimental results showed that the catalyst structure and performance were strongly affected by the loading of the active metal: the catalytic activity, methanol selectivity, and space-time yield increased with increasing metal loading until they reached their maximum values at a metal loading of 15 wt%, while further addition of metal inhibited the catalytic performance. The highest catalytic activity of 14% and methanol selectivity of 92% were obtained over the Cu/ZnO-SBA-15 catalyst with a total bimetallic loading of 15 wt%. The excellent performance of the 15 wt% Cu/ZnO-SBA-15 catalyst is attributed to the presence of well-dispersed active sites with small particle size, a higher Cu surface area, and lower catalyst reducibility.

Keywords: hydrogenation of carbon dioxide, methanol synthesis, metal loading, Cu/ZnO-SBA-15 catalyst

Procedia PDF Downloads 214
26342 Reliability-Based Method for Assessing Liquefaction Potential of Soils

Authors: Mehran Naghizaderokni, Asscar Janalizadechobbasty

Abstract:

This paper explores a probabilistic method for assessing the liquefaction potential of sandy soils. The current simplified methods for assessing soil liquefaction potential use a deterministic safety factor to determine whether liquefaction will occur or not; however, these methods are unable to determine the liquefaction probability related to a safety factor. A solution to this problem can be found through reliability analysis. This paper presents a reliability analysis method based on a popular deterministic liquefaction analysis method. The proposed probabilistic method is formulated based on the results of reliability analyses of 190 field records and observations of soil performance against liquefaction. The results of the present study show that a safety factor greater or smaller than 1 does not, by itself, mean safety or liquefaction, and that to establish the liquefaction probability, a reliability-based analysis should be used. This reliability method uses the empirical acceleration attenuation law in the Chalos area to derive the probability density function and the statistics for the earthquake-induced cyclic shear stress ratio (CSR). The CSR and CRR statistics are then used with the first-order second-moment method to calculate the relation between the liquefaction probability, the safety factor, and the reliability index. Based on the proposed method, the liquefaction probability related to a safety factor can be easily calculated, and the influence of some of the soil parameters on the liquefaction probability can be quantitatively evaluated.
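
A minimal sketch of a first-order second-moment (FOSM) reliability calculation for the limit state g = CRR - CSR, of the kind referenced above; the means and standard deviations are invented for illustration, not statistics derived from the 190 field records:

```python
# FOSM reliability index and liquefaction probability for g = CRR - CSR (independent normals).
from math import sqrt
from statistics import NormalDist

mu_crr, sigma_crr = 0.32, 0.08   # cyclic resistance ratio statistics (assumed)
mu_csr, sigma_csr = 0.24, 0.05   # cyclic stress ratio statistics (assumed)

# Reliability index for a linear limit state with independent normal variables
beta = (mu_crr - mu_csr) / sqrt(sigma_crr**2 + sigma_csr**2)

# Liquefaction probability and the conventional deterministic safety factor
p_liq = NormalDist().cdf(-beta)
fs = mu_crr / mu_csr

print(f"safety factor = {fs:.2f}, beta = {beta:.2f}, P(liquefaction) = {p_liq:.3f}")
```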

Keywords: liquefaction, reliability analysis, chalos area, civil and structural engineering

Procedia PDF Downloads 456
26341 Computational Fluid Dynamics Analysis of a Biomass Burner Gas Chamber in OpenFOAM

Authors: Óscar Alfonso Gómez Sepúlveda, Julián Ernesto Jaramillo, Diego Camilo Durán

Abstract:

The global climate crisis has affected different aspects of human life, and in an effort to reverse the effects generated, we seek to optimize and improve the equipment and plants that produce high CO₂ emissions, which can be achieved through numerical simulation. Such equipment includes biomass combustion chambers. The objective of this research is to visualize the thermal behavior of a gas chamber used in the process of obtaining vegetable extracts. The simulation is carried out with OpenFOAM, taking into account the conservation of energy, turbulence, and radiation; for the purposes of the simulation, combustion is omitted and replaced by heat generation. In the results, the streamlines generated by the primary and secondary flows are analyzed in order to visualize whether they produce the expected effect and whether the energy is used to the maximum. Radiation is included in order to assess its influence and also to reduce the computational time required for the mesh analysis. An analysis with simplified geometries and experimental data is carried out to corroborate the selection of the models to be used, and it is found that the appropriate turbulence model is the standard k-ω. As a means of verification, a general energy balance is performed and compared with the results of the numerical analysis; the error is 1.67%, which is considered acceptable. Among the improvement options explored, it was found that implementing fins can increase the heat transfer by up to 7.3%.

Keywords: CFD analysis, biomass, heat transfer, radiation, OpenFOAM

Procedia PDF Downloads 105
26340 Analysis of Dynamics Underlying the Observation Time Series by Using a Singular Spectrum Approach

Authors: O. Delage, H. Bencherif, T. Portafaix, A. Bourdier

Abstract:

The main purpose of time series analysis is to learn about the dynamics behind time-ordered measurement data. Two approaches are used in the literature to gain a better knowledge of the dynamics contained in observed data sequences. The first approach concerns time series decomposition, an important analysis step that allows patterns and behaviors to be extracted as components, providing insight into the mechanisms producing the time series. In many cases, time series are short, noisy, and non-stationary. To provide components that are physically meaningful, methods such as Empirical Mode Decomposition (EMD), the Empirical Wavelet Transform (EWT) or, more recently, Empirical Adaptive Wavelet Decomposition (EAWD) have been proposed. The second approach is to reconstruct the dynamics underlying the time series as a trajectory in state space by mapping the time series into a set of Rᵐ lag vectors using the method of delays (MOD). Takens proved that the trajectory obtained with the MOD technique is equivalent to the trajectory representing the dynamics behind the original time series. This work introduces singular spectrum decomposition (SSD), a new adaptive method for decomposing non-linear and non-stationary time series into narrow-band components. The method originates from singular spectrum analysis (SSA), a nonparametric spectral estimation method used for the analysis and prediction of time series. As the first step of SSD is to constitute a trajectory matrix by embedding the one-dimensional time series into a set of lagged vectors, SSD can also be seen as a reconstruction method like MOD. We first give a brief overview of the existing decomposition methods (EMD, EWT, EAWD). The SSD method is then described in detail and applied to experimental time series of total ozone column observations. The results obtained are compared with those provided by the previously mentioned decomposition methods. We also compare the reconstruction quality of the observed dynamics obtained from the SSD and MOD methods.
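
A minimal sketch of the embedding step shared by SSA/SSD and the method of delays: build a trajectory (Hankel) matrix from lagged vectors, take its SVD, and reconstruct the leading component by anti-diagonal averaging; the synthetic signal and window length are assumptions, not the ozone series analyzed in the paper:

```python
# Basic SSA-style embedding, SVD, and reconstruction of the leading oscillatory component.
import numpy as np

def trajectory_matrix(x, L):
    """Embed a 1-D series into L-lag vectors (columns are lagged windows)."""
    N = len(x)
    K = N - L + 1
    return np.column_stack([x[i:i + L] for i in range(K)])   # shape (L, K)

def reconstruct(X_elem):
    """Anti-diagonal averaging of an elementary matrix back to a 1-D series."""
    L, K = X_elem.shape
    N = L + K - 1
    series = np.zeros(N)
    counts = np.zeros(N)
    for i in range(L):
        for j in range(K):
            series[i + j] += X_elem[i, j]
            counts[i + j] += 1
    return series / counts

# Synthetic noisy oscillation
t = np.arange(500)
x = np.sin(2 * np.pi * t / 60) + 0.3 * np.random.default_rng(4).normal(size=t.size)

L = 120                                   # window (embedding) length
X = trajectory_matrix(x, L)
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# The leading pair of singular triplets usually captures a single oscillatory component
X1 = s[0] * np.outer(U[:, 0], Vt[0]) + s[1] * np.outer(U[:, 1], Vt[1])
component = reconstruct(X1)
print("variance captured by first two components:",
      round(float((s[:2] ** 2).sum() / (s ** 2).sum()), 3))
```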

Keywords: time series analysis, adaptive time series decomposition, wavelet, phase space reconstruction, singular spectrum analysis

Procedia PDF Downloads 89