Search results for: discrete event simulation (DES)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5990

3800 Modeling and Characterization of Organic LED

Authors: Bouanati Sidi Mohammed, N. E. Chabane Sari, Mostefa Kara Selma

Abstract:

It is well known that organic light emitting diodes (OLEDs) are attracting great interest in the display technology industry due to their many advantages, such as low manufacturing cost, large-area electroluminescent displays, and various colors of emission, including white light. Recently, there has been much progress in understanding the device physics of OLEDs and their basic operating principles. In OLEDs, light emission is the result of the recombination of electrons and holes in the light emitting layer, which are injected from the cathode and anode. To improve luminescence efficiency, hole and electron pairs need to exist abundantly and in equal numbers and to recombine swiftly in the emitting layer. The aim of this paper is to model a polymer LED and an OLED made with small molecules in order to study their electrical and optical characteristics. The first simulated structure used in this paper is a monolayer device, typically consisting of the poly(2-methoxy-5-(2'-ethyl)hexoxy-phenylenevinylene) (MEH-PPV) polymer sandwiched between an anode, usually an indium tin oxide (ITO) substrate, and a cathode, such as Al. In the second structure, MEH-PPV is replaced by tris(8-hydroxyquinolinato) aluminum (Alq3). MEH-PPV was chosen because of its solubility in common organic solvents, in conjunction with a low operating voltage for light emission and a relatively high conversion efficiency, and Alq3 because it is one of the most important host materials used in OLEDs. In this simulation, the Poole-Frenkel-like mobility model and the Langevin bimolecular recombination model have been used as the transport and recombination mechanisms. These models are enabled in the ATLAS-SILVACO software. The influence of doping and thickness on the I(V) characteristics and luminescence is reported.
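
As a rough illustration of the two transport and recombination models named above, the Python sketch below evaluates a Poole-Frenkel-type field-dependent mobility and the Langevin bimolecular recombination rate; the parameter values (mu0, E0, the carrier densities, and the relative permittivity) are placeholders for illustration, not the values used in the simulation.

```python
import numpy as np

# Physical constants
q = 1.602e-19        # elementary charge [C]
eps0 = 8.854e-12     # vacuum permittivity [F/m]

# Assumed (placeholder) material parameters for an organic semiconductor
mu0 = 1e-9           # zero-field mobility [m^2/Vs]
E0 = 2.0e7           # characteristic field of the Poole-Frenkel fit [V/m]
eps_r = 3.0          # relative permittivity

def poole_frenkel_mobility(E, mu0=mu0, E0=E0):
    """Simple Poole-Frenkel-like field dependence: mu = mu0 * exp(sqrt(E/E0))."""
    return mu0 * np.exp(np.sqrt(E / E0))

def langevin_recombination(n, p, mu_n, mu_p, eps_r=eps_r, n_i=1e12):
    """Langevin bimolecular recombination rate R = q*(mu_n + mu_p)/eps * (n*p - n_i^2)."""
    gamma = q * (mu_n + mu_p) / (eps0 * eps_r)   # recombination coefficient [m^3/s]
    return gamma * (n * p - n_i**2)

E = 5e7                                   # electric field in the emitting layer [V/m]
mu = poole_frenkel_mobility(E)
R = langevin_recombination(n=1e23, p=1e23, mu_n=mu, mu_p=mu)
print(f"mobility = {mu:.3e} m^2/Vs, recombination rate = {R:.3e} m^-3 s^-1")
```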

Keywords: organic light emitting diode, polymer light emitting diode, organic materials, hexoxy-phenylenevinylene

Procedia PDF Downloads 541
3799 The Direct Deconvolutional Model in the Large-Eddy Simulation of Turbulence

Authors: Ning Chang, Zelong Yuan, Yunpeng Wang, Jianchun Wang

Abstract:

The utilization of Large Eddy Simulation (LES) has been extensive in turbulence research. LES concentrates on resolving the significant grid-scale motions while representing smaller scales through subfilter-scale (SFS) models. The deconvolution model, among the available SFS models, has proven successful in LES of engineering and geophysical flows. Nevertheless, the thorough investigation of how sub-filter scale dynamics and filter anisotropy affect SFS modeling accuracy remains lacking. The outcomes of LES are significantly influenced by filter selection and grid anisotropy, factors that have not been adequately addressed in earlier studies. This study examines two crucial aspects of LES: Firstly, the accuracy of direct deconvolution models (DDM) is evaluated concerning sub-filter scale (SFS) dynamics across varying filter-to-grid ratios (FGR) in isotropic turbulence. Various invertible filters are employed, including Gaussian, Helmholtz I and II, Butterworth, Chebyshev I and II, Cauchy, Pao, and rapidly decaying filters. The importance of FGR becomes evident as it plays a critical role in controlling errors for precise SFS stress prediction. When FGR is set to 1, the DDM models struggle to faithfully reconstruct SFS stress due to inadequate resolution of SFS dynamics. Notably, prediction accuracy improves when FGR is set to 2, leading to accurate reconstruction of SFS stress, except for cases involving Helmholtz I and II filters. Remarkably high precision, nearly 100%, is achieved at an FGR of 4 for all DDM models. Furthermore, the study extends to filter anisotropy and its impact on SFS dynamics and LES accuracy. By utilizing the dynamic Smagorinsky model (DSM), dynamic mixed model (DMM), and direct deconvolution model (DDM) with anisotropic filters, aspect ratios (AR) ranging from 1 to 16 are examined in LES filters. The results emphasize the DDM's proficiency in accurately predicting SFS stresses under highly anisotropic filtering conditions. Notably high correlation coefficients exceeding 90% are observed in the a priori study for the DDM's reconstructed SFS stresses, surpassing those of the DSM and DMM models. However, these correlations tend to decrease as filter anisotropy increases. In the a posteriori analysis, the DDM model consistently outperforms the DSM and DMM models across various turbulence statistics, including velocity spectra, probability density functions related to vorticity, SFS energy flux, velocity increments, strain-rate tensors, and SFS stress. It is evident that as filter anisotropy intensifies, the results of DSM and DMM deteriorate, while the DDM consistently delivers satisfactory outcomes across all filter-anisotropy scenarios. These findings underscore the potential of the DDM framework as a valuable tool for advancing the development of sophisticated SFS models for LES in turbulence research.
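
To make the direct deconvolution idea concrete, the sketch below applies a few van Cittert iterations to recover an approximately defiltered 1D velocity field from a Gaussian-filtered one and then reconstructs the SFS stress; the filter width, number of iterations, and the synthetic velocity field are illustrative assumptions, not the settings used in the study.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

# Synthetic 1D "velocity" field and a Gaussian LES filter (illustrative only)
N, sigma = 512, 4.0
x = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
rng = np.random.default_rng(0)
u = np.sin(x) + 0.3 * np.sin(8 * x) + 0.05 * rng.standard_normal(N)

def G(f):
    """The (invertible, here Gaussian) LES filter."""
    return gaussian_filter1d(f, sigma, mode="wrap")

u_bar = G(u)                      # resolved (filtered) field available to the LES

# Van Cittert iterations: u* <- u* + (u_bar - G(u*)), an approximate inverse of G
u_star = u_bar.copy()
for _ in range(10):
    u_star = u_star + (u_bar - G(u_star))

# SFS stress reconstructed from the deconvolved field: tau = G(u* u*) - u_bar u_bar
tau_ddm = G(u_star * u_star) - u_bar * u_bar
tau_true = G(u * u) - u_bar * u_bar          # exact SFS stress for this synthetic field

corr = np.corrcoef(tau_ddm, tau_true)[0, 1]
print(f"a priori correlation of reconstructed SFS stress: {corr:.3f}")
```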

Keywords: deconvolution model, large eddy simulation, subfilter scale modeling, turbulence

Procedia PDF Downloads 57
3798 Investigation of Information Security Incident Management Based on International Standard ISO/IEC 27002 in Educational Hospitals in 2014

Authors: Nahid Tavakoli, Asghar Ehteshami, Akbar Hassanzadeh, Fatemeh Amini

Abstract:

Introduction: The information security incident management guidelines were developed to help hospitals meet their information security event and incident management requirements. The purpose of this study was to investigate information security incident management in Isfahan's educational hospitals in accordance with the ISO/IEC 27002 standard. Methods: This was a cross-sectional study of the information security incident management of educational hospitals in 2014. Based on the ISO/IEC 27002 standard, two checklists were applied to check compliance with the requirements on Reporting Information Security Events and Weaknesses and on Management of Information Security Incidents and Improvements. One inspector was trained to carry out the assessments in the hospitals. The data were analyzed with SPSS. Findings: In general, the score of compliance with the information security incident management requirements in the two areas, Reporting Information Security Events and Weaknesses and Management of Information Security Incidents and Improvements, was 60%. There was a significant difference in compliance levels among the hospitals (p-value …).

Keywords: information security incident management, information security management, standards, hospitals

Procedia PDF Downloads 563
3797 Modeling the Human Harbor: An Equity Project in New York City, New York USA

Authors: Lauren B. Birney

Abstract:

The envisioned long-term outcome of this three-year research and implementation plan is for 1) teachers and students to design and build their own computational models of real-world environmental-human health phenomena occurring within the context of the "Human Harbor" and 2) project researchers to evaluate the degree to which these integrated Computer Science (CS) education experiences in New York City (NYC) public school classrooms (PreK-12) impact students' computational-technical skill development, job readiness, career motivations, and measurable abilities to understand, articulate, and solve the underlying phenomena at the center of their models. This effort builds on the partnership's successes over the past eight years in developing a benchmark model of restoration-based Science, Technology, Engineering, and Math (STEM) education for urban public schools and achieving relatively broad-based implementation in the nation's largest public school system. The Billion Oyster Project Curriculum and Community Enterprise for Restoration Science (BOP-CCERS STEM + Computing) curriculum, teacher professional development, and community engagement programs have reached more than 200 educators and 11,000 students at 124 schools, with 84 waterfront locations and Out of School Time (OST) programs. The BOP-CCERS Partnership is poised to develop a more refined focus on integrating computer science across the STEM domains; teaching industry-aligned computational methods and tools; and explicitly preparing students from the city's most under-resourced and underrepresented communities for upwardly mobile careers in NYC's ever-expanding "digital economy," in which jobs require computational thinking and an increasing percentage require discrete computer science technical skills. Project objectives include the following: 1. Computational Thinking (CT) Integration: integrate computational thinking core practices across the existing middle/high school BOP-CCERS STEM curriculum as a means of scaffolding toward long-term computer science and computational modeling outcomes. 2. Data Science and Data Analytics: enable researchers to perform interviews with teachers, students, community members, partners, stakeholders, and Science, Technology, Engineering, and Mathematics (STEM) industry professionals; collaborative analysis and data collection were also performed. As a centerpiece, the BOP-CCERS partnership will expand to include a dedicated computer science education partner. The New York City Department of Education (NYCDOE) Computer Science for All (CS4ALL) NYC will serve as the dedicated Computer Science (CS) lead, advising the consortium on integration and curriculum development, working in tandem. The BOP-CCERS Model™ also validates that, with appropriate application of technical infrastructure, intensive teacher professional development, and curricular scaffolding, socially connected science learning can be mainstreamed in the nation's largest urban public school system. This is evidenced and substantiated in the initial phases of BOP-CCERS™. The BOP-CCERS™ student curriculum and teacher professional development have been implemented in approximately 24% of NYC public middle schools, reaching more than 250 educators and 11,000 students directly. BOP-CCERS™ is a fully scalable and transferable educational model, adaptable to all American school districts.
In all settings of the proposed Phase IV initiative, the primary beneficiary group will be underrepresented NYC public school students who live in high-poverty neighborhoods and are traditionally underrepresented in the STEM fields, including African Americans, Latinos, English language learners, and children from economically disadvantaged households. In particular, BOP-CCERS Phase IV will explicitly prepare underrepresented students for skilled positions within New York City’s expanding digital economy, computer science, computational information systems, and innovative technology sectors.

Keywords: computer science, data science, equity, diversity and inclusion, STEM education

Procedia PDF Downloads 42
3796 Study of Chlorine Gas Leak Consequences in Direct Chlorination System Failure in Cooling Towers in the Petrochemical Industry

Authors: Mohammad H. Ruhipour, Mahdi Goharrokhi, Mahsa Ghasemi, Artadokht Ostadsarayi

Abstract:

In this paper, we aim to study the consequences of a chlorine gas leak when using direct chlorine gas injection compared with using bleach (sodium hypochlorite), examining the negative effects on both the environment and individuals. This study was performed on the cooling towers of the natural fractionation unit of the Bandar-e-Imam petrochemical plant. Chlorine gas is highly toxic and, based on health regulations, its release into the surrounding environment can be very dangerous and even fatal for people; we therefore performed quantitative studies of the worst cases of event occurrence. In addition, studying alternative methods with a lower risk was also on the agenda, in order to select the option least likely to cause an accident. In this paper, the consequences of a chlorine gas release have been evaluated using the PHAST software. A chlorine gas concentration of 10 ppm was taken as the basis for hazardous area determination. The results show that the full chlorine gas line rupture scenario under Pasquill stability category F was the worst case, and many people around the cooling tower area could be harmed by chlorine gas inhalation.
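
As a back-of-the-envelope illustration of the kind of dispersion estimate a consequence-modeling tool produces, the sketch below evaluates a steady-state Gaussian plume centerline concentration for a continuous ground-level release under stable (Pasquill F) conditions and converts it to ppm; the release rate, wind speed, and the Briggs-type dispersion coefficients are generic textbook assumptions, not values from the study.

```python
import numpy as np

# Assumed release and weather parameters (illustrative, not from the study)
Q = 0.5          # continuous release rate of Cl2 [kg/s]
u = 2.0          # wind speed [m/s], typical of stable Pasquill F conditions
MW_CL2 = 70.9    # molar mass of chlorine [g/mol]

def briggs_sigma_F(x):
    """Commonly quoted Briggs open-country fits for Pasquill class F (x in m)."""
    sigma_y = 0.04 * x / np.sqrt(1.0 + 0.0001 * x)
    sigma_z = 0.016 * x / (1.0 + 0.0003 * x)
    return sigma_y, sigma_z

def centerline_ppm(x, Q=Q, u=u):
    """Ground-level centerline concentration of a ground-level release, in ppm."""
    sy, sz = briggs_sigma_F(x)
    c = Q / (np.pi * u * sy * sz)             # [kg/m^3], Gaussian plume, y = z = 0
    mg_m3 = c * 1e6                           # [mg/m^3]
    return mg_m3 * 24.45 / MW_CL2             # ppm at ~25 degC, 1 atm

for x in (100.0, 500.0, 1000.0, 2000.0):
    print(f"x = {x:6.0f} m  ->  {centerline_ppm(x):10.1f} ppm")
```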

Keywords: chlorine gas, consequence modeling, cooling towers, direct chlorination, risk assessment, system failure

Procedia PDF Downloads 262
3795 Fixed-Frequency Pulse Width Modulation-Based Sliding Mode Controller for Switching Multicellular Converter

Authors: Rihab Hamdi, Amel Hadri Hamida, Ouafae Bennis, Fatima Babaa, Sakina Zerouali

Abstract:

This paper presents a sliding mode controller (SMC) for closed-loop voltage control of a DC-DC three-cell buck converter connected in parallel, operating in continuous conduction mode (CCM) and based on pulse-width modulation (PWM). To maintain a constant switching frequency, the approach is to incorporate a pulse-width modulator that uses the equivalent control, derived by applying the SM control method, to produce a control signal that is compared with the fixed-frequency carrier within the modulator. Detailed stability and transient performance analyses have been conducted using Lyapunov stability criteria to restrict the switching frequency variation in the face of wide variations in output load, input changes, and set-point changes. The results obtained confirm the effectiveness of the proposed control scheme in achieving an enhanced output transient performance while faithfully realizing its control objective in the event of abrupt and uncertain parameter variations. Simulation studies in the MATLAB/Simulink environment are performed to confirm the idea.
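
A minimal sketch of the idea, assuming a single averaged buck cell with placeholder component values (L, C, R, Vin) and a simple PD-type sliding surface; the duty cycle produced by the equivalent control plus a smoothed switching term is saturated to [0, 1] exactly as a fixed-frequency PWM modulator would do. This is an illustrative toy model, not the multicell controller of the paper.

```python
import numpy as np

# Assumed converter parameters (placeholders, not the paper's values)
Vin, L, C, R = 24.0, 100e-6, 470e-6, 5.0
Vref = 12.0
dt, T_sim = 1e-6, 0.02          # integration step and simulated time [s]

# Sliding surface S = lam*e + de/dt with e = Vref - v; gains are illustrative
lam, phi = 5000.0, 1e4          # surface gain and boundary-layer width of the switching term

v, iL = 0.0, 0.0
for _ in range(int(T_sim / dt)):
    dvdt = (iL - v / R) / C
    e, de = Vref - v, -dvdt
    S = lam * e + de

    # Equivalent control: duty ratio that makes dS/dt = 0 on the averaged model
    d_eq = (v + L * (C * lam * de + dvdt / R)) / Vin
    # Add a smoothed switching term and saturate, as the fixed-frequency PWM would
    d = np.clip(d_eq + 0.5 * np.clip(S / phi, -1.0, 1.0), 0.0, 1.0)

    # Averaged buck converter dynamics (continuous conduction mode)
    iL += dt * (d * Vin - v) / L
    v += dt * (iL - v / R) / C

print(f"output voltage after {T_sim * 1e3:.0f} ms: {v:.2f} V (reference {Vref} V)")
```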

Keywords: DC-DC converter, pulse width modulation, power electronics, sliding mode control

Procedia PDF Downloads 124
3794 Towards an Understanding of Social Capital in an Online Community of Filipino Music Artists

Authors: Jerome V. Cleofas

Abstract:

Cyberspace has become a more viable arena for budding artists to share musical acts through digital forms. The increasing relevance of online communities has attracted scholars from various fields demonstrating its influence on social capital. This paper extends this understanding of social capital among Filipino music artists belonging to the SoundCloud Philippines Facebook Group. The study makes use of various qualitative data obtained from key-informant interviews and participant observation of online and physical encounters, analyzed using the case study approach. SoundCloud Philippines has over seven hundred members and is composed of Filipino singers, instrumentalists, composers, arrangers, producers, multimedia artists, and event managers. Group interactions are a mix of online encounters based on Facebook and SoundCloud and physical encounters through meet-ups and events. Benefits reaped from the community include informational, technical, instrumental, promotional, motivational, and social support. Under the guidance of online group administrators, collaborative activities such as music productions, concerts, and events take place. Most conflicts and problems that arise are resolved peacefully. Social capital in SoundCloud Philippines is mobilized through recognition, respect, and reciprocity.

Keywords: Facebook, music artists, online communities, social capital

Procedia PDF Downloads 301
3793 Study of Structural Behavior and Proton Conductivity of Inorganic Gel Paste Electrolyte at Various Phosphorous to Silicon Ratio by Multiscale Modelling

Authors: P. Haldar, P. Ghosh, S. Ghoshdastidar, K. Kargupta

Abstract:

In polymer electrolyte membrane fuel cells (PEMFC), the membrane electrode assembly (MEA) consists of two platinum-coated carbon electrodes sandwiching a proton-conducting, phosphoric acid doped polymeric membrane. Due to low mechanical stability, flooding, and fuel crossover, the application of phosphoric acid in a polymeric membrane is very critical. Phosphorous and silica based 3D inorganic gels have gained attention in the field of supercapacitors, fuel cells, and metal hydride batteries due to their thermally stable, highly proton-conductive behavior. Also, as a large amount of water molecules and phosphoric acid can easily get trapped in the cavities of the Si-O-Si network, leaching out is prevented. In this study, we have performed molecular dynamics (MD) simulations and first-principles calculations to understand the structural, electronic, electrochemical, and morphological behavior of this inorganic gel at various P to Si ratios. We have used dipole-dipole interactions, H bonding, and van der Waals forces to study the main interactions between the molecules. A 'structure-property-performance' mapping is initiated to determine the optimum P to Si ratio for the best proton conductivity. We have performed the MD simulations at various temperatures to understand the temperature dependency of the proton conductivity. The observed results are used to propose a model which fits well with experimental data and other literature values. We have also studied the mechanism behind the proton conductivity, and finally we propose a structure for the gel paste with the optimum P to Si ratio.
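
One standard way such MD trajectories are turned into a proton conductivity estimate is via the mean-square displacement and the Nernst-Einstein relation; the sketch below shows that post-processing step on a synthetic random-walk trajectory. The trajectory, time step, carrier density, and temperature are placeholders, not results from this work.

```python
import numpy as np

kB, q = 1.380649e-23, 1.602e-19     # Boltzmann constant [J/K], proton charge [C]

# Placeholder "trajectory": n_protons random walkers, shape (n_steps, n_protons, 3), in meters
rng = np.random.default_rng(1)
n_steps, n_protons, dt = 20000, 50, 1e-15          # dt = 1 fs
traj = np.cumsum(rng.normal(scale=2e-12, size=(n_steps, n_protons, 3)), axis=0)

# Mean-square displacement averaged over protons: MSD(t) = <|r(t) - r(0)|^2>
disp = traj - traj[0]
msd = np.mean(np.sum(disp**2, axis=2), axis=1)
t = np.arange(n_steps) * dt

# Diffusion coefficient from the long-time slope: MSD = 6 D t (fit the second half)
D = np.polyfit(t[n_steps // 2:], msd[n_steps // 2:], 1)[0] / 6.0

# Nernst-Einstein estimate of conductivity: sigma = n q^2 D / (kB T)
n_carrier, T = 1e27, 353.0                          # carrier density [1/m^3], temperature [K]
sigma = n_carrier * q**2 * D / (kB * T)
print(f"D = {D:.3e} m^2/s, sigma = {sigma:.3e} S/m")
```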

Keywords: first principle calculation, molecular dynamics simulation, phosphorous and silica based 3D inorganic gel, polymer electrolyte membrane fuel cells, proton conductivity

Procedia PDF Downloads 106
3792 Energy Performance Gaps in Residences: An Analysis of the Variables That Cause Energy Gaps and Their Impact

Authors: Amrutha Kishor

Abstract:

Today, with rising global warming and the depletion of resources, every industry is moving toward sustainability and energy efficiency. As part of this movement, it is nowadays obligatory for architects to play their part by creating energy predictions for their designs. But in many cases, these predictions do not reflect the real quantities of energy used by newly built buildings in operation. These discrepancies can be described as 'energy performance gaps'. This study aims to determine the underlying reasons for these gaps. Seven houses designed by Allan Joyce Architects, UK, from 1998 until 2019 were considered for this study. The data from the residents' energy bills were cross-referenced with the predictions made with the software SefairaPro and from energy reports. Results indicated that the predictions did not match the actual energy usage. An account of how energy was used in these seven houses was made by means of personal interviews. The main factors considered in the study were occupancy patterns, heating systems and usage, lighting profile and usage, and appliances' profile and usage. The study found that the main reasons for the creation of energy gaps were the discrepancies between predicted and actual occupant usage and patterns of energy consumption. This study is particularly useful for energy-conscious architectural firms to fine-tune their approach to designing houses and analysing their energy performance. As the findings reveal that energy usage in homes varies based on the way residents use the space, they help deduce the most efficient technological combinations. This information can be used to set guidelines for future policies and regulations related to energy consumption in homes. This study can also be used by the developers of simulation software to understand how architects use their product and drive improvements in its future versions.
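
A simple way to quantify such a gap, assuming annual predicted and metered consumption figures are available, is the percentage deviation of measured from predicted energy use; the numbers below are made up for illustration and are not data from the study.

```python
# Hypothetical annual figures per house [kWh/year]; not data from the study
houses = {
    "House A": {"predicted": 8200.0, "measured": 11150.0},
    "House B": {"predicted": 9400.0, "measured": 9050.0},
    "House C": {"predicted": 7600.0, "measured": 10300.0},
}

for name, e in houses.items():
    gap_kwh = e["measured"] - e["predicted"]
    gap_pct = 100.0 * gap_kwh / e["predicted"]      # positive = uses more than predicted
    print(f"{name}: gap = {gap_kwh:+7.0f} kWh/year ({gap_pct:+5.1f} %)")
```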

Keywords: architectural simulation, energy efficient design, energy performance gaps, environmental design

Procedia PDF Downloads 103
3791 Nonlinear Homogenized Continuum Approach for Determining Peak Horizontal Floor Acceleration of Old Masonry Buildings

Authors: Andreas Rudisch, Ralf Lampert, Andreas Kolbitsch

Abstract:

It is a well-known fact among the engineering community that earthquakes with comparatively low magnitudes can cause serious damage to nonstructural components (NSCs) of buildings, even when the supporting structure performs relatively well. Past research works focused mainly on NSCs of nuclear power plants and industrial plants. Particular attention should also be given to architectural façade elements of old masonry buildings (e.g. ornamental figures, balustrades, vases), which are very vulnerable under seismic excitation. Large numbers of these historical nonstructural components (HiNSCs) can be found in highly frequented historical city centers, and in the event of failure, they pose a significant danger to persons. In order to estimate the vulnerability of acceleration-sensitive HiNSCs, the peak horizontal floor acceleration (PHFA) is used. The PHFA depends on the dynamic characteristics of the building, the ground excitation, and induced nonlinearities. Consequently, the PHFA cannot be generalized as a simple function of height. In the present research work, an extensive case study was conducted to investigate the influence of induced nonlinearity on the PHFA for old masonry buildings. Probabilistic nonlinear FE time-history analyses considering three different hazard levels were performed. A set of eighteen synthetically generated ground motions was used as input to the structure models. An elastoplastic macro-model (multiPlas) for nonlinear homogenized continuum FE-calculation was calibrated to multiple scales and applied, taking specific failure mechanisms of masonry into account. The macro-model was calibrated according to the results of specific laboratory and cyclic in situ shear tests. The nonlinear macro-model is based on the concept of multi-surface rate-independent plasticity. Material damage or crack formation is detected by reducing the initial strength after failure due to shear or tensile stress. As a result, shear forces can only be transmitted to a limited extent by friction when the cracking begins. The tensile strength is reduced to zero. The first goal of the calibration was the consistency of the load-displacement curves between experiment and simulation. The calibrated macro-model matches well with regard to the initial stiffness and the maximum horizontal load. Another goal was the correct reproduction of the observed crack image and the plastic strain activities. Again, the macro-model proved to work well in this case and shows very good correlation. The results of the case study show that there is significant scatter in the absolute distribution of the PHFA between the applied ground excitations. An absolute distribution along the normalized building height was determined in the framework of probability theory. It can be observed that the extent of nonlinear behavior varies for the three hazard levels. Due to the detailed scope of the present research work, a robust comparison with code recommendations and simplified PHFA distributions is possible. The chosen methodology offers a chance to determine the distribution of PHFA along the building height of old masonry structures. This permits a proper hazard assessment of HiNSCs under seismic loads.
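
For reference, extracting the PHFA from nonlinear time-history results is itself a simple post-processing step: for each floor and each ground motion, take the peak absolute horizontal floor acceleration, then summarize the distribution over the ground-motion set along the normalized height. The sketch below assumes synthetic acceleration histories in place of FE output.

```python
import numpy as np

# Placeholder floor acceleration histories: (n_motions, n_floors, n_samples) in m/s^2,
# standing in for the output of the nonlinear FE time-history analyses.
rng = np.random.default_rng(2)
n_motions, n_floors, n_samples = 18, 6, 4000
acc = rng.normal(scale=np.linspace(0.8, 2.0, n_floors)[None, :, None],
                 size=(n_motions, n_floors, n_samples))

# PHFA per motion and per floor: peak absolute horizontal floor acceleration
phfa = np.max(np.abs(acc), axis=2)                  # shape (n_motions, n_floors)

# Distribution along the normalized building height
z_over_H = (np.arange(n_floors) + 1) / n_floors
for i, z in enumerate(z_over_H):
    med, p84 = np.median(phfa[:, i]), np.percentile(phfa[:, i], 84)
    print(f"z/H = {z:.2f}:  median PHFA = {med:.2f} m/s^2, 84th percentile = {p84:.2f} m/s^2")
```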

Keywords: nonlinear macro-model, nonstructural components, time-history analysis, unreinforced masonry

Procedia PDF Downloads 152
3790 Atmospheric Transport Modeling of Radio-Xenon Detections Possibly Related to the Announced Nuclear Test in North Korea on February 12, 2013

Authors: Kobi Kutsher

Abstract:

On February 12, 2013, monitoring stations of the Preparatory Commission of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) detected a seismic event with explosion-like underground characteristics in the Democratic People's Republic of Korea (DPRK). The location was found to be in the vicinity of the two previous announced nuclear tests in 2006 and 2009. The nuclear test was also announced by the government of the DPRK. After an underground nuclear explosion, radioactive fission products (mostly noble gases) can seep through layers of rock and sediment until they escape into the atmosphere. The fission products are dispersed in the atmosphere and may be detected thousands of kilometers downwind from the test site. Indeed, more than seven weeks after the explosion, unusual detections of noble gases were reported at the radionuclide station in Takasaki, Japan. The radionuclide station is part of the International Monitoring System, operated to verify the CTBT. This study provides an estimation of the possible source region and the total radioactivity of the release using atmospheric transport modeling.

Keywords: atmospheric transport modeling, CTBTO, nuclear tests, radioactive fission products

Procedia PDF Downloads 417
3789 Alternative Robust Estimators for the Shape Parameters of the Burr XII Distribution

Authors: Fatma Zehra Doğru, Olcay Arslan

Abstract:

In this paper, we propose alternative robust estimators for the shape parameters of the Burr XII distribution. We provide a small simulation study and a real data example to illustrate the performance of the proposed estimators over the ML and the LS estimators.
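
For context, a minimal sketch of the baseline maximum likelihood fit that such robust estimators are typically compared against, using SciPy's Burr XII implementation on simulated data with a few outliers added; the sample size, true shape parameters, and contamination are arbitrary choices for illustration, not the settings of the paper's simulation study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Simulate Burr XII data (scipy's burr12 has shape parameters c and d) and contaminate it
c_true, d_true = 2.0, 3.0
x = stats.burr12.rvs(c_true, d_true, size=500, random_state=rng)
x_cont = np.concatenate([x, 10.0 + 5.0 * rng.random(10)])     # a few gross outliers

# Maximum likelihood fit of the two shape parameters (location fixed at 0, scale at 1)
c_ml, d_ml, _, _ = stats.burr12.fit(x, floc=0, fscale=1)
c_mlc, d_mlc, _, _ = stats.burr12.fit(x_cont, floc=0, fscale=1)

print(f"true shapes:            c = {c_true:.2f}, d = {d_true:.2f}")
print(f"ML fit (clean data):    c = {c_ml:.2f}, d = {d_ml:.2f}")
print(f"ML fit (contaminated):  c = {c_mlc:.2f}, d = {d_mlc:.2f}  # sensitivity motivating robust estimators")
```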

Keywords: burr xii distribution, robust estimator, m-estimator, least squares

Procedia PDF Downloads 417
3788 Field-Observed Thermal Fractures during Reinjection and Their Numerical Simulation

Authors: Wen Luo, Phil J. Vardon, Anne-Catherine Dieudonne

Abstract:

One key process that partly controls the success of geothermal projects is fluid reinjection, which helps in dealing with waste water, maintaining reservoir pressure, and supplying the heat-exchange medium. Thus, sustaining the injectivity is of great importance for the efficiency and sustainability of geothermal production. However, the injectivity is sensitive to the reinjection process. Field experience has illustrated that the injectivity can be damaged or improved. In this paper, the focus is on how the injectivity is improved. Since the injection pressure is far below the formation fracture pressure, hydraulic fracturing cannot be the mechanism contributing to the increase in injectivity. Instead, thermal stimulation has been identified as the main contributor to improving the injectivity. For low-enthalpy geothermal reservoirs, which are not fracture-controlled, thermal fracturing, rather than thermal shearing, is expected to be the mechanism for increasing injectivity. In this paper, field data from the sedimentary low-enthalpy geothermal reservoirs in the Netherlands were analysed to show the occurrence of thermal fracturing due to the cooling shock during reinjection. Injection data were collected and compared to show the effects of the thermal fractures on injectivity. Then, a thermo-hydro-mechanical (THM) model for the near-field formation was developed and solved by the finite element method to simulate the observed thermal fractures. It was then compared with the HM model, decomposed from the THM model, to illustrate the thermal effects on thermal fracturing. Finally, the effects of operational parameters, i.e. injection temperature and pressure, on the changes in injectivity were studied on the basis of the THM model. The field data analysis and simulation results illustrate that thermal fracturing occurred during reinjection and contributed to the increase in injectivity. The injection temperature was identified as a key parameter that contributes to thermal fracturing.
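
A quick way to see why cooling shock alone can fracture the near-well formation is the classical thermo-elastic estimate sigma_T = E * alpha * dT / (1 - nu): if the thermally induced tensile stress exceeds the sum of the effective minimum in-situ stress and the tensile strength, a thermal fracture can initiate. The sketch below evaluates that simplified criterion for placeholder rock properties and injection temperatures; none of the numbers come from the paper.

```python
# Placeholder sandstone-like properties and states; not values from the study
E = 15e9          # Young's modulus [Pa]
nu = 0.25         # Poisson's ratio [-]
alpha = 1.0e-5    # linear thermal expansion coefficient [1/K]
T_res = 80.0      # reservoir temperature [degC]
sigma_min = 5e6   # effective minimum in-situ stress near the well [Pa]
T0 = 2e6          # tensile strength of the rock [Pa]

def thermal_stress(T_inj):
    """Thermo-elastic tensile stress induced by cooling the rock from T_res to T_inj."""
    dT = T_res - T_inj
    return E * alpha * dT / (1.0 - nu)

for T_inj in (60.0, 40.0, 25.0):
    s_T = thermal_stress(T_inj)
    fractures = s_T > sigma_min + T0      # simplified initiation criterion
    print(f"T_inj = {T_inj:4.0f} degC: thermal stress = {s_T / 1e6:5.1f} MPa "
          f"-> thermal fracturing {'likely' if fractures else 'unlikely'}")
```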

Keywords: injectivity, reinjection, thermal fracturing, thermo-hydro-mechanical model

Procedia PDF Downloads 204
3787 Ethanol Chlorobenzene Dosimeter Usage for Measuring the Dose of an Intraoperative Linear Electron Accelerator System

Authors: Mojtaba Barzegar, Alireza Shirazi, Saied Rabi Mahdavi

Abstract:

Intraoperative radiation therapy (IORT) is an innovative treatment modality based on the delivery of a large single dose of radiation to the tumor bed during surgery. The success of radiotherapy depends on the absorbed dose delivered to the tumor. Achieving better accuracy in patient treatment depends upon the dose measured by a standard dosimeter such as an ionization chamber; however, because of the high density of electric charge per pulse produced by the accelerator in the ionization chamber volume, the standard correction factor for ion recombination, Ksat, calculated with the classic two-voltage method is overestimated, so the use of dose-per-pulse independent dosimeters such as the chemical Fricke and ethanol chlorobenzene (ECB) dosimeters has been suggested. Dose is usually calculated and calibrated at Zmax. Ksat was calculated by comparison of the ion chamber response and the ECB dosimeter at each applicator angle, size, and dose. The relative output factors for the IORT applicators have been calculated and compared with experimentally determined values and with the results simulated by Monte Carlo software. The absorbed doses have been calculated and measured with statistical uncertainties of less than 0.7% and 2.5%, respectively. The relative differences between calculated and measured OFs were up to 2.5%; for most OFs, the agreement was better. In these conditions, together with the relative absorbed dose calculations, the OFs can be considered an indication that the IORT electron beams have been well simulated. These investigations demonstrate that the full Monte Carlo simulation of the accelerator head, together with the ECB dosimeter, allows us to obtain detailed information on clinical IORT beams.
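
The comparison described above amounts to taking the ratio of the dose reported by the dose-per-pulse independent ECB dosimeter to the uncorrected ionization chamber result; a minimal sketch with made-up readings follows (the readings, the chamber calibration coefficient, and the other correction factors are placeholders, not measured values from the study).

```python
# Hypothetical readings for one applicator / energy setting (placeholders only)
D_ecb = 10.0            # dose from the ECB dosimeter [Gy], dose-per-pulse independent
M_chamber = 176.0       # uncorrected ionization chamber reading [nC]
N_dw = 0.054            # chamber calibration coefficient [Gy/nC], placeholder value
k_other = 1.003         # product of the other (non-recombination) correction factors

# Effective saturation correction inferred from the comparison:
# D_ecb = M_chamber * N_dw * k_other * k_sat  =>  k_sat = D_ecb / (M * N_dw * k_other)
k_sat = D_ecb / (M_chamber * N_dw * k_other)
print(f"effective k_sat from ECB comparison = {k_sat:.3f}")

# Relative output factor of an applicator, normalized to the reference applicator
D_applicator, D_reference = 9.62, 10.0     # doses at Zmax [Gy], placeholders
OF = D_applicator / D_reference
print(f"relative output factor = {OF:.3f}")
```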

Keywords: intra operative radiotherapy, ethanol chlorobenzene, ksat, output factor, monte carlo simulation

Procedia PDF Downloads 462
3786 Arousal, Encoding, and Intrusive Memories

Authors: Hannah Gutmann, Rick Richardson, Richard Bryant

Abstract:

Intrusive memories following a traumatic event are not uncommon. However, in some individuals, these memories become maladaptive and lead to prolonged stress reactions. A seminal model of PTSD explains that aberrant processing during trauma may lead to prolonged stress reactions and intrusive memories. This model explains that elevated arousal at the time of the trauma promotes data-driven processing, leading to fragmented and intrusive memories. This study investigated the role of elevated arousal in the development of intrusive memories. We measured salivary markers of arousal and investigated what impact this had on data-driven processing, memory fragmentation, and subsequently, the development of intrusive memories. We assessed 100 healthy participants to understand their processing style, arousal, and experience of intrusive memories. Participants were randomised to a control or experimental condition, the latter of which was designed to increase their arousal. Based on current theory, participants in the experimental condition were expected to engage in more data-driven processing and experience more intrusive memories than participants in the control condition. This research aims to shed light on the mechanisms underlying the development of intrusive memories, to illustrate ways in which therapeutic approaches for PTSD may be augmented for greater efficacy.

Keywords: stress, cortisol, SAA, PTSD, intrusive memories

Procedia PDF Downloads 179
3785 Detection of Clipped Fragments in Speech Signals

Authors: Sergei Aleinik, Yuri Matveev

Abstract:

In this paper a novel method for the detection of clipping in speech signals is described. It is shown that the new method has better performance than known clipping detection methods, is easy to implement, and is robust to changes in signal amplitude, size of data, etc. Statistical simulation results are presented.
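
A common baseline for clipping detection (not necessarily the authors' method, which the abstract does not spell out) is to look for an abnormal concentration of samples at the extremes of the amplitude histogram; the sketch below flags a signal as clipped when more than a small fraction of its samples sit within a threshold of the observed maximum. The margin and fraction thresholds are illustrative assumptions.

```python
import numpy as np

def is_clipped(x, margin=0.01, fraction=0.02):
    """Flag a signal as clipped if more than `fraction` of its samples lie within
    `margin` (relative to the peak) of the observed maximum absolute amplitude."""
    x = np.asarray(x, dtype=float)
    peak = np.max(np.abs(x))
    if peak == 0:
        return False
    near_peak = np.mean(np.abs(x) >= (1.0 - margin) * peak)
    return near_peak > fraction

# Demo on a synthetic "speech-like" signal and its hard-clipped version
fs = 16000
t = np.arange(0, 1.0, 1.0 / fs)
clean = 0.6 * np.sin(2 * np.pi * 220 * t) * (1 + 0.3 * np.sin(2 * np.pi * 3 * t))
clipped = np.clip(1.8 * clean, -0.7, 0.7)

print("clean  :", is_clipped(clean))     # expected False
print("clipped:", is_clipped(clipped))   # expected True
```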

Keywords: clipping, clipped signal, speech signal processing, digital signal processing

Procedia PDF Downloads 379
3784 Engineering Seismological Studies in and around Zagazig City, Sharkia, Egypt

Authors: M. El-Eraki, A. A. Mohamed, A. A. El-Kenawy, M. S. Toni, S. I. Mustafa

Abstract:

The aim of this paper is to study the ground vibrations using the Nakamura technique to evaluate the relation between the ground conditions and the earthquake characteristics. Microtremor measurements were carried out at 55 sites in and around Zagazig city. The signals were processed using the horizontal-to-vertical spectral ratio (HVSR) technique to estimate the fundamental frequencies of the soil deposits and their corresponding H/V amplitudes. Seismic measurements were acquired at nine sites for recording the surface waves. The recorded waveforms were processed using the multi-channel analysis of surface waves (MASW) method to infer the shear wave velocity profile. The obtained fundamental frequencies were found to range from 0.7 to 1.7 Hz, and the maximum H/V amplitude reached 6.4. These results, together with the average shear wave velocity in the surface layers, were used for the estimation of the thickness of the uppermost soft cover layers (depth to bedrock). The sediment thickness generally increases in the northeastern and southwestern parts of the area, which is in good agreement with the local geological structure. The results of this work show the zones of higher potential damage in the event of an earthquake in the study area.
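
The core of the Nakamura (HVSR) processing is the ratio of the smoothed horizontal and vertical amplitude spectra of the microtremor record; the sketch below shows it on synthetic three-component noise. The sampling rate, window length, and the synthetic resonance near 1 Hz are assumptions for illustration, not the field parameters of the study.

```python
import numpy as np
from scipy.signal import welch, butter, lfilter

fs, dur = 100.0, 600.0                      # 100 Hz sampling, 10-minute record
rng = np.random.default_rng(4)
n = int(fs * dur)

# Synthetic microtremor: vertical = white noise, horizontals = noise with a ~1 Hz resonance
v = rng.standard_normal(n)
b, a = butter(2, [0.8 / (fs / 2), 1.3 / (fs / 2)], btype="band")

def horiz():
    return rng.standard_normal(n) + 4.0 * lfilter(b, a, rng.standard_normal(n))

ns, ew = horiz(), horiz()

# Smoothed amplitude spectra via Welch, then the H/V ratio
f, P_v = welch(v, fs=fs, nperseg=4096)
_, P_ns = welch(ns, fs=fs, nperseg=4096)
_, P_ew = welch(ew, fs=fs, nperseg=4096)
H = np.sqrt((P_ns + P_ew) / 2.0)            # mean horizontal amplitude spectrum
V = np.sqrt(P_v)
hv = H / V

band = (f > 0.2) & (f < 10.0)               # search band for the fundamental frequency
f0 = f[band][np.argmax(hv[band])]
print(f"fundamental frequency ~ {f0:.2f} Hz, peak H/V amplitude ~ {hv[band].max():.1f}")
```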

Keywords: ambient vibrations, fundamental frequency, surface waves, zagazig

Procedia PDF Downloads 271
3795 Simulation of a Renal Phantom Using MAG3

Authors: Ati Moncef

Abstract:

We describe in this paper the results of a dynamic renal phantom study with MAG3. Our phantom consisted of two kidney-shaped compartments and one liver. These phantoms were scanned with static and dynamic protocols and compared with clinical data. Under normal conditions, using our phantoms, it is possible to acquire renal images that can be compared with clinical scintigraphy. In conclusion, the renal phantom can also be used in the quality control of renal scintigraphy.

Keywords: renal scintigraphy, MAG3, nuclear medicine, gamma camera

Procedia PDF Downloads 388
3782 The Communication Library DIALOG for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

Modern experiments in high energy physics impose great demands on the reliability, the efficiency, and the data rate of Data Acquisition Systems (DAQ). This contribution focuses on the development and deployment of the new communication library DIALOG for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. The iFDAQ, utilizing a hardware event builder, is designed to be able to read out data at the maximum rate of the experiment. The DIALOG library is a communication system for both distributed and mixed environments; it provides a network-transparent inter-process communication layer. Using the high-performance and modern C++ framework Qt and its Qt Network API, the DIALOG library presents an alternative to the previously used DIM library. The DIALOG library was fully incorporated into all processes in the iFDAQ during the 2016 run. From the software point of view, it might be considered a significant improvement of the iFDAQ in comparison with the previous run. To extend the possibilities of debugging, the online monitoring of communication among processes via the DIALOG GUI is a desirable feature. In the paper, we present the DIALOG library from several perspectives and discuss it in a detailed way. Moreover, an efficiency measurement and a comparison with the DIM library with respect to the iFDAQ requirements are provided.

Keywords: data acquisition system, DIALOG library, DIM library, FPGA, Qt framework, TCP/IP

Procedia PDF Downloads 304
3781 Experimental Study and Numerical Simulation of the Reaction and Flow on the Membrane Wall of Entrained Flow Gasifier

Authors: Jianliang Xu, Zhenghua Dai, Zhongjie Shen, Haifeng Liu, Fuchen Wang

Abstract:

In an entrained flow gasifier, the combustible components are converted into the gas phase, and the mineral content is converted into ash. Most of the ash particles or droplets are deposited on the refractory or membrane wall and form a slag layer that flows down to the quenching system. The reaction process of the captured particles and the slag flow and phase transformation play an important role in gasifier performance and in safe and stable operation. The reaction characteristics of captured char particles on the molten slag were studied by applying a high-temperature stage microscope. The gasification process of captured chars with CO2 on the slag surface was observed and recorded and compared with the gasification of the original char. The particle size evolution and heat transfer process are discussed, and the gasification reaction index of the captured char particles is modeled. The analysis of the reaction index shows that the molten slag layer promoted char reactivity. Coupled with the heat transfer analysis, a shrinking particle model (SPM) was applied and modified to predict the gasification time at a carbon conversion of 0.9, and the results showed agreement with the experimental data. A comprehensive model with gas-particle-slag flow and reaction sub-models was used to model different industrial gasifiers. Carbon conversion in the gasifier space and on the slag layer surface is investigated. The slag flow characteristics, such as slag velocity, molten slag thickness, and slag temperature distribution on the membrane wall and refractory brick, are discussed.
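
For reference, a shrinking particle model under surface-reaction control gives the simple relation t / tau = 1 - (1 - X)^(1/3) between reaction time and carbon conversion X; a minimal sketch follows, where the characteristic time tau is a placeholder rather than a value fitted to the experiments.

```python
def spm_time(X, tau):
    """Shrinking particle model, surface-reaction control:
    t = tau * (1 - (1 - X)**(1/3)), with tau the time for complete conversion."""
    return tau * (1.0 - (1.0 - X) ** (1.0 / 3.0))

tau = 600.0                      # placeholder characteristic time [s]
for X in (0.5, 0.9, 0.99):
    print(f"X = {X:4.2f}  ->  t = {spm_time(X, tau):6.1f} s")

# Conversely, conversion reached after a given residence time t
t = 300.0
X_t = 1.0 - (1.0 - t / tau) ** 3
print(f"t = {t:.0f} s  ->  X = {X_t:.3f}")
```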

Keywords: char, slag, numerical simulation, gasification, wall reaction, membrane wall

Procedia PDF Downloads 291
3780 Fluid Structure Interaction Study between Ahead and Angled Impact of AGM 88 Missile Entering Relatively High Viscous Fluid for K-Omega Turbulence Model

Authors: Abu Afree Andalib, Rafiur Rahman, Md Mezbah Uddin

Abstract:

The main objective of this work is to analyze the various parameters of the AGM-88 missile using the FSI module in Ansys. Computational fluid dynamics is used for the study of the fluid flow pattern and fluidic phenomena such as drag, pressure force, energy dissipation, and shockwave distribution in water. Using the finite element analysis module of Ansys, structural parameters such as stress and stress density, localization point, deflection, and force propagation are determined. A separate analysis of the structural parameters is done in Abaqus. A state-of-the-art coupling module is used for the FSI analysis. A fine mesh is considered in every case for better results during simulation, according to the computational machine power. The results for the above-mentioned parameters are analyzed and compared for the two phases using graphical representation. The results of Ansys and Abaqus are also shown. Computational fluid dynamics and finite element analyses, and subsequently the fluid-structure interaction (FSI) technique, are considered. The finite volume method and the finite element method are considered for modelling the fluid flow and for the structural parameter analysis, respectively. Feasible boundary conditions are also utilized in the research. A significant change in the interaction and interference pattern at impact was found. Both theoretically and according to the simulation, the angled condition was found to produce a higher impact.

Keywords: FSI (Fluid Surface Interaction), impact, missile, high viscous fluid, CFD (Computational Fluid Dynamics), FEM (Finite Element Analysis), FVM (Finite Volume Method), fluid flow, fluid pattern, structural analysis, AGM-88, Ansys, Abaqus, meshing, k-omega, turbulence model

Procedia PDF Downloads 451
3779 Exploring the Challenges to Usage of Building Construction Cost Indices in Ghana

Authors: Jerry Gyimah, Ernest Kissi, Safowaa Osei-Tutu, Charles Dela Adobor, Theophilus Adjei-Kumi, Ernest Osei-Tutu

Abstract:

A price fluctuation contract is imperative and of paramount essence in the construction industry, as it provides adequate relief and cushioning for changes in the prices of input resources during construction. As a result, several methods have been devised to better help in arriving at fair recompense in the event of price changes. However, stakeholders often appear not to be satisfied with the existing methods of fluctuation evaluation, ostensibly because of the challenges associated with them. The aim of this study was to identify the challenges to the usage of building construction cost indices in Ghana. Data were gathered from contractors and quantity surveying firms. The study utilized a survey questionnaire approach to elicit responses from the contractors and the consultants. The data gathered were analyzed using the relative importance index (RII) to rank the problems associated with the existing methods. The findings revealed, among others, the late release of data, inadequate recovery of costs, and work items of interest not being included in the published indices as the main challenges of the existing methods. The findings provide useful lessons for policymakers and practitioners in decision making toward the usage and improvement of the available indices.
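
For clarity, a relative importance index of this kind is typically computed as RII = sum(W) / (A * N), where W are the Likert ratings given by respondents, A is the highest rating on the scale, and N is the number of respondents; the sketch below ranks a few challenges on a 5-point scale. The ratings are invented for illustration and are not survey data from the study.

```python
# Hypothetical 5-point Likert ratings from respondents for each challenge
ratings = {
    "Late release of data":                         [5, 4, 5, 5, 4, 5, 4, 5],
    "Inadequate recovery of costs":                 [4, 5, 4, 4, 5, 4, 4, 5],
    "Work items not included in published indices": [4, 4, 3, 5, 4, 4, 5, 3],
    "Unfamiliarity with the indices":               [3, 3, 4, 2, 3, 4, 3, 3],
}
A = 5  # highest point on the rating scale

def rii(scores, A=A):
    """Relative importance index: RII = sum(W) / (A * N), in (0, 1]."""
    return sum(scores) / (A * len(scores))

ranked = sorted(ratings.items(), key=lambda kv: rii(kv[1]), reverse=True)
for rank, (challenge, scores) in enumerate(ranked, start=1):
    print(f"{rank}. {challenge:<48s} RII = {rii(scores):.3f}")
```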

Keywords: building construction cost indices, challenges, usage, Ghana

Procedia PDF Downloads 134
3778 Applying Wavelet Transform to Ferroresonance Detection and Protection

Authors: Chun-Wei Huang, Jyh-Cherng Gu, Ming-Ta Yang

Abstract:

Non-synchronous breaker operation or line failure in power systems with light or no loads can lead to core saturation in transformers or potential transformers. This can cause matching between the nonlinear inductance and the system capacitance, resulting in the formation of resonant circuits, which trigger ferroresonance. This study employed a wavelet transform for the detection of ferroresonance. Simulation results demonstrate the efficacy of the proposed method.
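
A minimal sketch of wavelet-based disturbance detection of this kind, using PyWavelets: the signal is decomposed into detail coefficients, and a sudden growth of fine-scale detail energy after the event onset flags the disturbance. The synthetic waveform, wavelet choice, and threshold are illustrative assumptions, not the paper's settings.

```python
import numpy as np
import pywt

fs = 10000.0
t = np.arange(0.0, 0.4, 1.0 / fs)

# Synthetic voltage: normal 50 Hz operation, then a distorted ferroresonance-like
# waveform (added harmonics and a sub-harmonic square component) after t = 0.2 s
v = np.sin(2 * np.pi * 50 * t)
after = t >= 0.2
v[after] += 0.8 * np.sin(2 * np.pi * 150 * t[after]) \
            + 0.5 * np.sign(np.sin(2 * np.pi * 25 * t[after]))

# Discrete wavelet decomposition; examine the finest detail coefficients window by window
coeffs = pywt.wavedec(v, "db4", level=4)
d1 = coeffs[-1]                                  # finest-scale detail coefficients
win = 50
energy = np.array([np.sum(d1[i:i + win] ** 2) for i in range(0, len(d1) - win, win)])

threshold = 5.0 * np.median(energy[: len(energy) // 3])   # baseline from the pre-event part
alarm = np.argmax(energy > threshold)
print(f"disturbance flagged around t ~ {alarm * win * 2 / fs:.3f} s")   # 2 samples per d1 coeff
```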

Keywords: ferroresonance, wavelet transform, intelligent electronic device, transformer

Procedia PDF Downloads 482
3777 Automatic Generation of Unified Modelling Language Use Case Diagrams and Test Cases Based on the Classification Tree Method

Authors: Wassana Naiyapo, Atichat Sangtong

Abstract:

The processes in software development using the object-oriented methodology have many stages that take time and incur high costs. An undetected error in the system analysis process will affect the design and implementation processes, and the unexpected output is the reason why the previous process must be revised. Every rollback of a process adds expense and delay. Therefore, with a good test process from the early phases, the implemented software is efficient, reliable, and also meets the user's requirements. The Unified Modelling Language (UML) is a tool which uses symbols to describe the work process in Object Oriented Analysis (OOA). This paper presents an approach for automatically generating a UML use case diagram and test cases. The UML use case diagram is generated from the event table, and the test cases are generated from the use case specifications and the Graphical User Interfaces (GUI). The test cases are derived using the Classification Tree Method (CTM), which classifies data into nodes in a hierarchical structure. Moreover, this paper describes the program that generates the use case diagram and the test cases. As a result, it can reduce work time and increase work efficiency.
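
In essence, the classification tree method partitions each input (classification) into equivalence classes and then combines one class per classification into a test case; a minimal sketch that enumerates the combinations for a hypothetical login use case follows. The classifications and classes are invented for illustration, not taken from the paper's event table.

```python
from itertools import product

# Hypothetical classification tree for a "login" use case:
# each classification is partitioned into disjoint equivalence classes (the leaves).
classification_tree = {
    "username": ["valid", "unknown", "empty"],
    "password": ["correct", "wrong", "empty"],
    "account_state": ["active", "locked"],
}

# Minimal combination rule: full cartesian product of one leaf per classification.
# (In practice, CTM tools prune this with combination rules such as pairwise coverage.)
test_cases = [dict(zip(classification_tree, combo))
              for combo in product(*classification_tree.values())]

for i, tc in enumerate(test_cases, start=1):
    print(f"TC{i:02d}: {tc}")
print(f"{len(test_cases)} test cases generated")
```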

Keywords: classification tree method, test case, UML use case diagram, use case specification

Procedia PDF Downloads 150
3776 Architectural Wind Data Maps Using an Array of Wireless Connected Anemometers

Authors: D. Serero, L. Couton, J. D. Parisse, R. Leroy

Abstract:

In urban planning, an increasing number of cities require wind analysis to verify the comfort of public spaces and the areas around buildings. These studies are made using computational fluid dynamics (CFD) simulation. However, this technique is often based on wind information taken from meteorological stations located several kilometers from the spot of analysis. The approximated input data on the project surroundings produce imprecise results for this type of analysis. They can only be used to get the general behavior of wind in a zone, not to evaluate precise wind speeds. This paper presents another approach to this problem, based on collecting wind data and generating an urban wind cartography using connected ultrasonic anemometers. These are wireless devices that send immediate data on wind to a remote server. Assembled in an array, these devices generate geo-localized data on wind, such as speed, temperature, and pressure, and allow us to compare wind behavior on a specific site or building. These Netatmo-type anemometers communicate by wifi with central equipment, which shares data acquired by a wide variety of devices, such as wind speed, indoor and outdoor temperature, rainfall, and sunshine. Besides its precision, this method extracts geo-localized data on any type of site that can be fed back into the architectural design of a building or a public place. Furthermore, this method allows a precise calibration of a virtual wind tunnel using numerical aeraulic simulations (like the STAR-CCM+ software) and then the development of a complete volumetric model of wind behavior over a roof area or an entire city block. The paper showcases connected ultrasonic anemometers, which were installed for an 18-month survey on four study sites in the Grand Paris region. This case study focuses on Paris as an urban environment with multiple historical layers whose diversity of typology and buildings allows considering different ways of capturing wind energy. The objective of this approach is to categorize the different types of wind in urban areas. This, particularly the identification of the minimum and maximum wind spectrum, helps define the choice and performance of wind energy capturing devices that could be installed there. Relevant factors include the location on the roof of a building, the type of wind, the altimetry of the device in relation to the roof levels, and the potential nuisances generated. The method allows identifying the characteristics of wind turbines in order to maximize their performance in an urban site with turbulent wind.

Keywords: computer fluid dynamic simulation in urban environment, wind energy harvesting devices, net-zero energy building, urban wind behavior simulation, advanced building skin design methodology

Procedia PDF Downloads 85
3775 The Analysis of Gizmos Online Program as Mathematics Diagnostic Program: A Story from an Indonesian Private School

Authors: Shofiayuningtyas Luftiani

Abstract:

Some private schools in Indonesia have started integrating the online program Gizmos into the teaching-learning process. Gizmos was developed to supplement the existing curriculum by being integrated into the instructional programs. The program has some features using inquiry-based simulation, in which students conduct exploration using a worksheet while teachers use the teacher guidelines to direct and assess students' performance. In this study, the discussion about Gizmos highlights its features as an assessment medium for mathematics learning for secondary school students. The discussion is based on a case study and a literature review from the Indonesian context. The purpose of applying Gizmos as an assessment medium is diagnostic assessment. As part of the diagnostic assessment, the teachers review the student exploration sheets, analyze the students' difficulties in particular, and consider the findings in planning the future learning process. This assessment becomes important since the teacher needs data about students' persistent weaknesses. Additionally, this program also helps to build students' understanding through its interactive simulations. Currently, the assessment over-emphasizes the students' answers in the worksheet based on the provided answer keys, while students perform their skills in translating the question, doing the simulation, and answering the question. However, the assessment should involve multiple perspectives and sources of students' performance, since the teacher should adjust the instructional programs to the complexity of students' learning needs and styles. Consequently, an approach to improving the assessment components is selected to challenge the current assessment. The purpose of this challenge is to involve not only cognitive diagnosis but also the analysis of skills and errors. Concerning the selected setting for this diagnostic assessment, which develops the combination of cognitive diagnosis, skills analysis, and error analysis, the teachers should create an assessment rubric. The rubric plays an important role as a guide providing a set of criteria for the assessment. Without a precise rubric, the teacher may document and follow up the data about students at risk of failure ineffectively. Furthermore, the teachers who employ the Gizmos program for diagnostic assessment might encounter some obstacles. Based on the assessment conditions in the selected setting, the obstacles involve time constraints, reluctance toward a higher teaching burden, and the students' behavior. Consequently, the teacher who chooses Gizmos with those approaches has to plan, implement, and evaluate the assessment. The main point of this assessment is not the result of the students' worksheets; rather, the diagnostic assessment is a two-stage process that prompts and effectively follows up both individual weaknesses and those of the learning process. Ultimately, the discussion of Gizmos as a medium of diagnostic assessment reflects the effort to improve the mathematics learning process.

Keywords: diagnostic assessment, error analysis, Gizmos online program, skills analysis

Procedia PDF Downloads 165
3774 A Combined CFD Simulation of Plateau Borders including Films and Transitional Areas of Liquid Foams

Authors: Abdolhamid Anazadehsayed, Jamal Naser

Abstract:

An integrated computational fluid dynamics model is developed for a combined simulation of Plateau borders, films, and the transitional areas between the films and the Plateau borders, to reduce the simplifications and shortcomings of available models for foam drainage at the micro-scale. Additionally, the counter-flow related to the Marangoni effect in the transitional area is investigated. The results of this combined model show the contribution of the films, the exterior Plateau borders, and the Marangoni flow in the drainage process more accurately, since the inter-influence of the foam's elements is included in this study. The flow rate in the exterior Plateau borders can be four times larger than in the interior ones. The exterior bubbles can be more prominent in the drainage process in cases where the number of exterior Plateau borders increases due to the geometry of the container. The ratio of the Marangoni counter-flow to the Plateau border flow increases drastically with an increase in the mobility of the air-liquid interface. However, the exterior bubbles follow the same trend with much less intensity since, typically, the flow is less dependent on the air-liquid interface in the exterior bubbles. Moreover, the Marangoni counter-flow in a near-wall transition area is less important than in an internal one. The influence of air-liquid interface mobility on the average velocity of interior foams is obtained with more accuracy with the more realistic boundary condition and has been compared with other numerical and analytical results. The contribution of the films to the drainage is significant for mobile foams, as the velocity of the flow in the film has the same order of magnitude as the velocity in the Plateau border. Nevertheless, for foams with rigid interfaces, the films' contribution to foam drainage is insignificant, particularly for the films near the wall of the container.

Keywords: foam, plateau border, film, Marangoni, CFD, bubble

Procedia PDF Downloads 331
3773 The Trigger-DAQ System in the Mu2e Experiment

Authors: Antonio Gioiosa, Simone Doanti, Eric Flumerfelt, Luca Morescalchi, Elena Pedreschi, Gianantonio Pezzullo, Ryan A. Rivera, Franco Spinella

Abstract:

The Mu2e experiment at Fermilab aims to measure the charged-lepton flavour violating neutrino-less conversion of a negative muon into an electron in the field of an aluminum nucleus. With the expected experimental sensitivity, Mu2e will improve the previous limit by four orders of magnitude. The Mu2e data acquisition (DAQ) system provides hardware and software to collect digitized data from the tracker, calorimeter, cosmic ray veto, and beam monitoring systems. Mu2e's trigger and data acquisition system (TDAQ) uses otsdaq as its solution. Developed at Fermilab, otsdaq uses the artdaq DAQ framework and the art analysis framework under the hood for event transfer, filtering, and processing. Otsdaq is an online DAQ software suite with a focus on flexibility and scalability while providing a multi-user, web-based interface accessible through the Chrome or Firefox web browser. The detector readout controllers (ROCs) of the tracker and calorimeter stream zero-suppressed data continuously to the data transfer controller (DTC). Data are then read over the PCIe bus by a software filter algorithm that selects events, which are finally combined with the data flux coming from the cosmic ray veto (CRV) system.

Keywords: trigger, daq, mu2e, Fermilab

Procedia PDF Downloads 142
3772 Electroencephalogram Based Approach for Mental Stress Detection during Gameplay with Level Prediction

Authors: Priyadarsini Samal, Rajesh Singla

Abstract:

Many mobile games provide the benefits of entertainment while introducing stress to the human brain. In recognizing this mental stress, the brain-computer interface (BCI) plays an important role. It offers various neuroimaging approaches which help in analyzing brain signals. Electroencephalography (EEG) is the most commonly used method among them, as it is non-invasive, portable, and economical. This paper investigates the pattern in brain signals when mental stress is introduced. Two healthy volunteers played a game whose aim was to search for hidden words in a grid, and the levels were chosen randomly. The EEG signals during gameplay were recorded to investigate the impact of stress as the level changed from easy to medium to hard. A total of 16 EEG features were analyzed for this experiment, including power band features with relative powers and event-related desynchronization, along with statistical features. A support vector machine was used as the classifier, which resulted in an accuracy of 93.9% for the three-level stress analysis; for two levels, accuracies of 92% and 98% were achieved. In addition, another game that was similar in nature was played by the volunteers. A suitable regression model was designed for prediction, where the feature sets of the first and second games were used for testing and training purposes, respectively, and an accuracy of 73% was found.
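
A minimal sketch of the feature-extraction and classification pipeline described above: relative band powers are computed from EEG epochs with Welch's method and fed to a support vector machine. The synthetic epochs, channel count, band definitions, and labels are placeholders, not the study's data or its exact 16-feature set.

```python
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

fs = 256                                     # sampling rate [Hz], assumed
bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_power_features(epoch):
    """Relative power in each band for one single-channel epoch."""
    f, pxx = welch(epoch, fs=fs, nperseg=fs * 2)
    in_range = (f >= 4) & (f <= 30)
    total = np.trapz(pxx[in_range], f[in_range])
    feats = []
    for lo, hi in bands.values():
        idx = (f >= lo) & (f < hi)
        feats.append(np.trapz(pxx[idx], f[idx]) / total)
    return feats

# Synthetic epochs: "low stress" alpha-dominated vs "high stress" beta-dominated activity
rng = np.random.default_rng(5)
def make_epoch(freq):
    t = np.arange(0, 4, 1 / fs)
    return np.sin(2 * np.pi * freq * t) + 0.8 * rng.standard_normal(t.size)

X = np.array([band_power_features(make_epoch(10)) for _ in range(40)] +
             [band_power_features(make_epoch(22)) for _ in range(40)])
y = np.array([0] * 40 + [1] * 40)            # 0 = low stress, 1 = high stress (labels assumed)

clf = SVC(kernel="rbf", C=1.0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```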

Keywords: brain computer interface, electroencephalogram, regression model, stress, word search

Procedia PDF Downloads 173
3771 The Role of Instruction in Knowledge Construction in Online Learning

Authors: Soo Hyung Kim

Abstract:

Two different learning approaches were suggested: focusing on factual knowledge or focusing on the meaning embedded in the statements. Each way of learning has positive effects on different question categories, where factual knowledge helps more with simple fact questions, and searching for meaning in the given information helps in learning causal relationships and the embedded meaning. To test this belief, two groups of learners (12 male and 39 female adults aged 18-37) watched a ten-minute-long YouTube video about various factual events of American history, their meaning, and the causal relations of the events. The fact group was asked to focus on factual knowledge in the video, and the meaning group was asked to focus on the embedded meaning in the video. After watching the video, both groups took multiple-choice questions, which consisted of 10 questions asking about the factual knowledge addressed in the video and 10 questions asking about the embedded meaning in the video, such as the causal relationships between historical events and the significance of the events. From an ANCOVA analysis, it was found that the fact group showed higher performance on the factual questions than the meaning group, although there was no group difference on the questions about meaning. The finding suggests that teacher instruction plays an important role in learners constructing different types of knowledge in online learning.

Keywords: factual knowledge, instruction, meaning-based knowledge, online learning

Procedia PDF Downloads 120