Search results for: dual phase lag model
18017 The Use of Performance Indicators for Evaluating Models of Drying Jackfruit (Artocarpus heterophyllus L.): Page, Midilli, and Lewis
Authors: D. S. C. Soares, D. G. Costa, J. T. S., A. K. S. Abud, T. P. Nunes, A. M. Oliveira Júnior
Abstract:
Mathematical models of drying are used to understand the drying process and to determine parameters important for the design and operation of the dryer. The jackfruit is a fruit with high consumption in the Northeast and high perishability, so techniques are needed to improve its conservation for longer in order to spread it to regions with low consumption. This study aimed to analyse several mathematical models (Page, Lewis, and Midilli) to indicate the one that best fits the conditions of the convective drying process, using performance indicators associated with each model: accuracy factor (Af), bias factor (Bf), root mean square error (RMSE) and standard error of prediction (%SEP). Jackfruit drying was carried out in a convective tray dryer at a temperature of 50°C for 9 hours. The Midilli model was the most accurate, with Af: 1.39, Bf: 1.33, RMSE: 0.01%, and SEP: 5.34. However, the Midilli model is not appropriate for process control purposes because it needs four tuning parameters. With the performance indicators used in this paper, the Page model showed similar results with only two parameters. It is concluded that the best correlation between the experimental and estimated data is given by the Page model.
Keywords: drying, models, jackfruit, biotechnology
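The three thin-layer drying models compared above have simple closed forms. A minimal sketch of the Page model and the RMSE indicator follows; the parameter values k and n are illustrative assumptions, not the fitted values from the study:

```python
import math

def page_model(t, k, n):
    """Page thin-layer drying model: MR = exp(-k * t^n)."""
    return math.exp(-k * t ** n)

def rmse(observed, predicted):
    """Root mean square error between observed and predicted moisture ratios."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / len(observed))

# Illustrative drying curve over 9 hours (k and n are hypothetical values)
k, n = 0.35, 1.1
times = list(range(10))                      # hours
mr = [page_model(t, k, n) for t in times]    # predicted moisture ratio
```

The Lewis model is the special case n = 1, which is why Page often fits better at the cost of one extra parameter.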
Procedia PDF Downloads 383
18016 Determination of Johnson-Cook Material and Failure Model Constants for High Tensile Strength Tendon Steel in Post-Tensioned Concrete Members
Authors: I. Gkolfinopoulos, N. Chijiwa
Abstract:
To evaluate the remaining capacity of tensioned concrete members, it is important to accurately estimate damage in precast concrete tendons. In this research, the Johnson-Cook model and damage parameters of a high-strength steel were calculated from static and dynamic uniaxial tensile tests. Replication of the experimental results was achieved through finite element analysis, both for a single 8-noded three-dimensional element and for the full-scale dog-bone shaped model, and the relevant model parameters are proposed. Finally, the simulation results in terms of strain and deformation were verified using digital image correlation analysis.
Keywords: DIC analysis, Johnson-Cook, quasi-static, dynamic, rupture, tendon
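For context, the Johnson-Cook flow stress combines strain hardening, strain-rate sensitivity, and thermal softening. A sketch of the standard formula follows; the constants below are hypothetical, not the calibrated values for the tendon steel in the paper:

```python
import math

def jc_flow_stress(eps, eps_rate, T, A, B, n, C, m,
                   eps_rate0=1.0, T_room=293.0, T_melt=1773.0):
    """Johnson-Cook flow stress:
    sigma = (A + B*eps^n) * (1 + C*ln(eps_rate/eps_rate0)) * (1 - T*^m),
    with homologous temperature T* = (T - T_room) / (T_melt - T_room)."""
    T_star = (T - T_room) / (T_melt - T_room)
    return ((A + B * eps ** n)
            * (1.0 + C * math.log(eps_rate / eps_rate0))
            * (1.0 - T_star ** m))

# Hypothetical constants (MPa) for a high-strength steel
sigma = jc_flow_stress(eps=0.05, eps_rate=1.0, T=293.0,
                       A=1600.0, B=500.0, n=0.3, C=0.014, m=1.0)
```

At the reference strain rate and room temperature, the rate and thermal terms reduce to one, leaving the quasi-static hardening curve A + B·ε^n.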
Procedia PDF Downloads 152
18015 Business Logic and Environmental Policy, a Research Agenda for the Business-to-Citizen Business Model
Authors: Mats Nilsson
Abstract:
The European electricity markets have been changing from regulated markets to, in some places, deregulated markets, and are now experiencing a strong influence of renewable support systems. Firms that rely on subsidies have a different business logic than firms acting in a market context. The article proposes that an offspring of the regular business models, the business-to-citizen model, should be used. The case of the European electricity market frames the concept of a business-to-citizen business model, and a research agenda for this concept is outlined.
Keywords: business logic, business model, subsidies, business-to-citizen
Procedia PDF Downloads 467
18014 Additional Method for the Purification of Lanthanide-Labeled Peptide Compounds Pre-Purified by Weak Cation Exchange Cartridge
Authors: K. Eryilmaz, G. Mercanoglu
Abstract:
Aim: Purification of the final product, which is the last step in the synthesis of lanthanide-labeled peptide compounds, can be accomplished by different methods. The two most commonly used are C18 solid phase extraction (SPE) and weak cation exchanger cartridge elution. The C18 SPE method yields a high-purity final product, while elution from the weak cation exchanger cartridge is pH dependent and ineffective in removing colloidal impurities. The aim of this work is to develop an additional purification method for the lanthanide-labeled peptide compound in cases where the desired radionuclidic and radiochemical purity of the final product cannot be achieved because of a pH problem or colloidal impurities. Material and Methods: For colloidal impurity formation, 3 mL of water for injection (WFI) was added to 30 mCi of 177LuCl3 solution and allowed to stand for 1 day. 177Lu-DOTATATE was synthesized using the EZAG ML-EAZY module (10 mCi/mL). After synthesis, the final product was mixed with the colloidal impurity solution (total volume: 13 mL, total activity: 40 mCi). The resulting mixture was trapped in an SPE-C18 cartridge. The cartridge was washed with 10 mL of saline to remove impurities to the waste vial. The product trapped in the cartridge was eluted with 2 mL of 50% ethanol and collected into the final product vial through a 0.22 µm filter. The final product was diluted with 10 mL of saline. Radiochemical purity before and after purification was analyzed by HPLC (column: ACE C18-100A, 3 µm, 150 x 3.0 mm; mobile phase: water-acetonitrile-trifluoroacetic acid (75:25:1); flow rate: 0.6 mL/min). Results: UV and radioactivity detector results in the HPLC analysis showed that colloidal impurities were completely removed from the 177Lu-DOTATATE/colloidal impurity mixture by the purification method.
Conclusion: The improved purification method can be used as an additional step to remove impurities that may result from lanthanide-peptide syntheses in which the weak cation exchange purification technique is used as the last step. The purity of the final product and GMP compliance (final aseptic filtration and sterile disposable system components) are two major advantages.
Keywords: lanthanide, peptide, labeling, purification, radionuclide, radiopharmaceutical, synthesis
Procedia PDF Downloads 166
18013 Comparison of Cognitive Load in Virtual Reality and Conventional Simulation-Based Training: A Randomized Controlled Trial
Authors: Michael Wagner, Philipp Steinbauer, Andrea Katharina Lietz, Alexander Hoffelner, Johannes Fessler
Abstract:
Background: Cardiopulmonary resuscitations are stressful situations in which vital decisions must be made within seconds. Lack of routine due to the infrequency of pediatric emergencies can lead to serious medical and communication errors. Virtual reality can fundamentally change the way simulation training is conducted in the future, and it appears to be a useful learning tool for technical and non-technical skills. It is important to investigate whether VR can provide a strong sense of presence within simulations. Methods: In this randomized study, we will enroll doctors and medical students from the Medical University of Vienna, who will receive learning material regarding the resuscitation of a one-year-old child. The study will be conducted in three phases. In the first phase, 20 physicians and 20 medical students from the Medical University of Vienna will be included. They will perform simulation-based training with a standardized scenario of a critically ill child with hypovolemic shock. The main goal of this phase is to establish a baseline for the following two phases, generating comparative values for cognitive load and stress. In phases 2 and 3, the same participants will perform the same scenario in a VR setting. In both settings, at three set points of progression, one of three predefined events is triggered; for each event, three different stress levels (easy, medium, difficult) are defined. Stress and cognitive load will be analyzed using the NASA Task Load Index, eye-tracking parameters, and heart rate. Subsequently, these values will be compared between VR training and traditional simulation-based training. Hypothesis: We hypothesize that the VR training and the traditional training groups will not differ in physiological response (cognitive load, heart rate, and heart rate variability). We further assume that virtual reality training can be used as cost-efficient additional training.
Objectives: The aim of this study is to measure cognitive load and stress level during real-life simulation training and compare them with VR training, in order to show that VR training evokes the same physiological response and cognitive load as real-life simulation training.
Keywords: virtual reality, cognitive load, simulation, adaptive virtual reality training
Procedia PDF Downloads 119
18012 Riesz Mixture Model for Brain Tumor Detection
Authors: Mouna Zitouni, Mariem Tounsi
Abstract:
This research introduces an application of the Riesz mixture model for medical image segmentation for accurate diagnosis and treatment of brain tumors. We propose a pixel classification technique based on the Riesz distribution, derived from an extended Bartlett decomposition. To our knowledge, this is the first study addressing this approach. The Expectation-Maximization algorithm is implemented for parameter estimation. A comparative analysis, using both synthetic and real brain images, demonstrates the superiority of the Riesz model over a recent method based on the Wishart distribution.
Keywords: EM algorithm, segmentation, Riesz probability distribution, Wishart probability distribution
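The abstract does not reproduce the Riesz-specific update equations, but the Expectation-Maximization alternation it relies on is the standard one. A minimal sketch for a two-component one-dimensional Gaussian mixture shows the structure; the paper's method would substitute Riesz densities for the Gaussian ones used here purely for illustration:

```python
import math

def em_gmm_1d(data, iters=50):
    """EM for a two-component 1-D Gaussian mixture (illustrative stand-in;
    the paper uses Riesz densities in place of the Gaussians)."""
    mu = [min(data), max(data)]   # crude initialization at the data extremes
    var = [1.0, 1.0]
    weights = [0.5, 0.5]

    def pdf(x, m, v):
        return math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        resp = []
        for x in data:
            w = [weights[k] * pdf(x, mu[k], var[k]) for k in range(2)]
            s = sum(w)
            resp.append([wk / s for wk in w])
        # M-step: re-estimate mixing weights, means, and variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            weights[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, data)) / nk, 1e-6)
    return weights, mu, var

# Two well-separated synthetic clusters near 0 and 10
weights, mu, var = em_gmm_1d([0.0, 0.1, -0.1, 10.0, 10.1, 9.9])
```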
Procedia PDF Downloads 27
18011 Coarse-Grained Computational Fluid Dynamics-Discrete Element Method Modelling of the Multiphase Flow in Hydrocyclones
Authors: Li Ji, Kaiwei Chu, Shibo Kuang, Aibing Yu
Abstract:
Hydrocyclones are widely used to classify particles by size in industries such as mineral processing and chemical processing. The particles to be handled usually have a broad range of size distributions and sometimes density distributions, which have to be properly considered, causing challenges in the modelling of hydrocyclones. The combined approach of Computational Fluid Dynamics (CFD) and Discrete Element Method (DEM) offers a convenient way to model particle size/density distributions. However, its direct application to hydrocyclones is computationally prohibitive because there are billions of particles involved. In this work, a CFD-DEM model based on the concept of the coarse-grained (CG) model is developed to model the solid-fluid flow in a hydrocyclone. The DEM is used to model the motion of discrete particles by applying Newton’s laws of motion; here, a particle assembly containing a certain number of particles with the same properties is treated as one CG particle. The CFD is used to model the liquid flow by numerically solving the local-averaged Navier-Stokes equations, facilitated with the Volume of Fluid (VOF) model to capture the air core. The results are analyzed in terms of fluid and solid flow structures, and particle-fluid, particle-particle and particle-wall interaction forces. Furthermore, the calculated separation performance is compared with measurements. The results obtained from the present study indicate that this approach can offer an alternative way to examine the flow and performance of hydrocyclones.
Keywords: computational fluid dynamics, discrete element method, hydrocyclone, multiphase flow
Procedia PDF Downloads 412
18010 BER Estimate of WCDMA Systems with MATLAB Simulation Model
Authors: Suyeb Ahmed Khan, Mahmood Mian
Abstract:
Simulation plays an important role during all phases of the design and engineering of communications systems, from the early stages of conceptual design through the various stages of implementation, testing, and fielding of the system. In the present paper, a simulation model has been constructed for the WCDMA system in order to evaluate its performance. This model describes multi-user effects and the calculation of BER (Bit Error Rate) in 3G mobile systems using Simulink in MATLAB 7.1. A Gaussian approximation defines the multi-user effect on system performance. BER has been analyzed by comparing transmitted and received data.
Keywords: WCDMA, simulations, BER, MATLAB
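The abstract does not state its exact expression, but a common textbook form of the Gaussian approximation for asynchronous DS-CDMA treats multiple-access interference as additional Gaussian noise. A sketch (the processing gain N and user count K below are illustrative assumptions):

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def ber_cdma_gaussian(K, N, eb_n0):
    """Approximate BER for K asynchronous DS-CDMA users with processing gain N,
    treating multiple-access interference as Gaussian noise:
        BER = Q( [ (K-1)/(3N) + N0/(2Eb) ]^(-1/2) ).
    For K = 1 this reduces to the single-user BPSK result Q(sqrt(2*Eb/N0))."""
    return q_func(1.0 / math.sqrt((K - 1) / (3.0 * N) + 1.0 / (2.0 * eb_n0)))

eb_n0 = 10 ** (6 / 10)                               # 6 dB
single = ber_cdma_gaussian(K=1, N=256, eb_n0=eb_n0)  # no interferers
loaded = ber_cdma_gaussian(K=20, N=256, eb_n0=eb_n0) # 19 interferers
```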
Procedia PDF Downloads 594
18009 Medial Temporal Tau Predicts Memory Decline in Cognitively Unimpaired Elderly
Authors: Angela T. H. Kwan, Saman Arfaie, Joseph Therriault, Zahra Azizi, Firoza Z. Lussier, Cecile Tissot, Mira Chamoun, Gleb Bezgin, Stijn Servaes, Jenna Stevenon, Nesrine Rahmouni, Vanessa Pallen, Serge Gauthier, Pedro Rosa-Neto
Abstract:
Alzheimer’s disease (AD) can be detected in living people using in vivo biomarkers of amyloid-β (Aβ) and tau, even in the absence of cognitive impairment during the preclinical phase. [¹⁸F]-MK-6240 is a high affinity positron emission tomography (PET) tracer that quantifies tau neurofibrillary tangles, but its ability to predict cognitive changes associated with early AD symptoms, such as memory decline, is unclear. Here, we assess the prognostic accuracy of baseline [¹⁸F]-MK-6240 tau PET for predicting longitudinal memory decline in asymptomatic elderly individuals. In a longitudinal observational study, we evaluated a cohort of cognitively normal elderly participants (n = 111) from the Translational Biomarkers in Aging and Dementia (TRIAD) study (data collected between October 2017 and July 2020, with a follow-up period of 12 months). All participants underwent tau PET with [¹⁸F]-MK-6240 and Aβ PET with [¹⁸F]-AZD-4694. The exclusion criteria included the presence of head trauma, stroke, or other neurological disorders. The 111 eligible participants were chosen based on the availability of Aβ PET, tau PET, magnetic resonance imaging (MRI), and APOEε4 genotyping. Among these participants, the mean (SD) age was 70.1 (8.6) years; 20 (18%) were tau PET positive, and 71 of 111 (63.9%) were women. A significant association between baseline Braak I-II [¹⁸F]-MK-6240 SUVR positivity and change in composite memory score was observed at the 12-month follow-up, after correcting for age, sex, and years of education (Logical Memory and RAVLT, standardized beta = -0.52 (-0.82 to -0.21), p < 0.001, for dichotomized tau PET and -1.22 (-1.84 to -0.61), p < 0.0001, for continuous tau PET).
Moderate cognitive decline was observed for A+T+ over the follow-up period, whereas no significant change was observed for A-T+, A+T-, and A-T-, though it should be noted that the A-T+ group was small. Our results indicate that baseline tau neurofibrillary tangle pathology is associated with longitudinal changes in memory function, supporting the use of [¹⁸F]-MK-6240 PET to predict the likelihood of asymptomatic elderly individuals experiencing future memory decline. Overall, [¹⁸F]-MK-6240 PET is a promising tool for predicting memory decline in older adults without cognitive impairment at baseline. This is of critical relevance as the field shifts towards a biological model of AD defined by the aggregation of pathologic tau; early detection of tau pathology using [¹⁸F]-MK-6240 PET offers the hope that patients with AD may be diagnosed during the preclinical phase, before it is too late.
Keywords: alzheimer’s disease, braak I-II, in vivo biomarkers, memory, PET, tau
Procedia PDF Downloads 83
18008 Deconstructing Local Area Networks Using MaatPeace
Authors: Gerald Todd
Abstract:
Recent advances in random epistemologies and ubiquitous theory have paved the way for web services. Given the current status of linear-time communication, cyberinformaticians compellingly desire the exploration of link-level acknowledgements. In order to realize this purpose, we concentrate our efforts on disconfirming that DHTs and model checking are mostly incompatible.
Keywords: LAN, cyberinformatics, model checking, communication
Procedia PDF Downloads 404
18007 Computational Simulations on Stability of Model Predictive Control for Linear Discrete-Time Stochastic Systems
Authors: Tomoaki Hashimoto
Abstract:
Model predictive control is a kind of optimal feedback control in which control performance over a finite future is optimized with a performance index that has a moving initial time and a moving terminal time. This paper examines the stability of model predictive control for linear discrete-time systems with additive stochastic disturbances. A sufficient condition for the stability of the closed-loop system with model predictive control is derived by means of a linear matrix inequality. The objective of this paper is to show the results of computational simulations in order to verify the validity of the obtained stability condition.
Keywords: computational simulations, optimal control, predictive control, stochastic systems, discrete-time systems
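As a companion to the abstract, the receding-horizon idea can be sketched for a scalar linear discrete-time plant: optimize the inputs over a finite moving horizon, apply only the first input, then repeat at the next step. This is a minimal deterministic illustration (plant parameters, horizon, and weights below are hypothetical; the paper's LMI stability condition is not reproduced):

```python
import numpy as np

def mpc_step(x, a, b, N=10, q=1.0, r=0.1):
    """One receding-horizon step for x+ = a*x + b*u, minimizing
    sum_k q*x_k^2 + r*u_k^2 over an N-step horizon via least squares."""
    # Free response: x_k = a^k * x without control
    F = np.array([a ** (k + 1) for k in range(N)])
    # Forced response: contribution of each input u_j to each state x_k
    G = np.zeros((N, N))
    for k in range(N):
        for j in range(k + 1):
            G[k, j] = a ** (k - j) * b
    # Stacked least-squares form of the quadratic cost
    A_ls = np.vstack([np.sqrt(q) * G, np.sqrt(r) * np.eye(N)])
    b_ls = np.concatenate([-np.sqrt(q) * F * x, np.zeros(N)])
    u = np.linalg.lstsq(A_ls, b_ls, rcond=None)[0]
    return u[0]  # receding horizon: apply only the first input

# Closed-loop simulation of an unstable scalar plant x+ = 1.2 x + u
a, b_gain = 1.2, 1.0
x = 5.0
for _ in range(30):
    u = mpc_step(x, a, b_gain)
    x = a * x + b_gain * u
```

Adding a stochastic disturbance term to the plant update would turn this into the setting the paper analyzes.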
Procedia PDF Downloads 436
18006 Surface-Enhanced Raman Detection in Chip-Based Chromatography via a Droplet Interface
Authors: Renata Gerhardt, Detlev Belder
Abstract:
Raman spectroscopy has attracted much attention as a structurally descriptive and label-free detection method. It is particularly suited for chemical analysis given that it is non-destructive and molecules can be identified via the fingerprint region of the spectra. In this work, possibilities are investigated for integrating Raman spectroscopy as a detection method for chip-based chromatography, making use of a droplet interface. A demanding task in lab-on-a-chip applications is the specific and sensitive detection of low-concentration analytes in small volumes. Fluorescence detection is frequently utilized but is restricted to fluorescent molecules; furthermore, no structural information is provided. Another often applied technique is mass spectrometry, which enables the identification of molecules based on their mass-to-charge ratio. Additionally, the obtained fragmentation pattern gives insight into the chemical structure. However, it is only applicable as end-of-line detection because analytes are destroyed during measurement. In contrast to mass spectrometry, Raman spectroscopy can be applied on-chip and substances can be processed further downstream after detection. A major drawback of Raman spectroscopy is the inherent weakness of the Raman signal, which is due to the small cross-sections associated with the scattering process. Enhancement techniques, such as surface enhanced Raman spectroscopy (SERS), are employed to overcome the poor sensitivity, even allowing detection at the single-molecule level. In SERS measurements, the Raman signal intensity is improved by several orders of magnitude if the analyte is in close proximity to nanostructured metal surfaces or nanoparticles. The main gain of lab-on-a-chip technology is the building-block-like ability to seamlessly integrate different functionalities, such as synthesis, separation, derivatization and detection, on a single device.
We intend to utilize this powerful toolbox to realize Raman detection in chip-based chromatography. By interfacing on-chip separations with a droplet generator, the separated analytes are encapsulated into numerous discrete containers. These droplets can then be injected with a silver nanoparticle solution and investigated via Raman spectroscopy. Droplet microfluidics is a sub-discipline of microfluidics which, instead of a continuous flow, operates with segmented flow. Segmented flow is created by merging two immiscible phases (usually an aqueous phase and oil), thus forming small discrete volumes of one phase in the carrier phase. The study surveys different chip designs to realize the coupling of chip-based chromatography with droplet microfluidics. With regard to maintaining a sufficient flow rate for chromatographic separation and ensuring stable eluent flow over the column, different flow rates of the eluent and oil phase are tested. Furthermore, the detection of analytes in droplets with surface enhanced Raman spectroscopy is examined. The compartmentalization of separated compounds preserves the analytical resolution, since the continuous phase restricts dispersion between the droplets. The droplets are ideal vessels for the insertion of silver colloids, thus making use of the surface enhancement effect and improving the sensitivity of the detection. The long-term goal of this work is the first realization of coupling chip-based chromatography with droplet microfluidics to employ surface enhanced Raman spectroscopy as the means of detection.
Keywords: chip-based separation, chip LC, droplets, Raman spectroscopy, SERS
Procedia PDF Downloads 250
18005 Assessment of the Impacts of Climate Change on Watershed Runoff Using Soil and Water Assessment Tool Model in Southeast Nigeria
Authors: Samuel Emeka Anarah, Kingsley Nnaemeka Ogbu, Obasi Arinze
Abstract:
Quantifying the hydrological response to climate change is imperative for the proper management of water resources within a watershed. The impact of climate change on the hydrology of the Upper Ebony River (UER) watershed, South East Nigeria, was studied using the Soil and Water Assessment Tool (SWAT) hydrological model. A climatological time series analysis from 1985 - 2014 using a non-parametric test showed significant negative trends in precipitation and relative humidity, while minimum and maximum temperature, solar radiation and wind speed showed significant positive trends. Future hypothetical land-use change scenarios (Scenarios 1, 2, 3 and 4), representing urbanization and conversion of forest to agricultural land, were combined with a downscaled future climate model (CSIRO-Mk3-6-0) and simulated in the SWAT model. Relative to the Baseline scenario (2005 - 2014), the results showed a decrease in streamflow by 10.29%, 26.20%, 11.80% and 26.72% for Scenarios 1, 2, 3, and 4, respectively. The model results suggest developing adaptation strategies to cope with the predicted hydrological conditions under future climate change in the watershed.
Keywords: climate change, hydrology, runoff, SWAT model
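The non-parametric trend test in such climatological analyses is commonly the Mann-Kendall test; whether the study used exactly this variant is an assumption. A minimal sketch without tie correction:

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction): returns S and Z.
    Positive Z indicates an increasing trend, negative a decreasing one;
    |Z| > 1.96 is significant at the 5% level (two-sided)."""
    n = len(x)
    s = sum(
        (x[j] > x[i]) - (x[j] < x[i])
        for i in range(n - 1) for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```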
Procedia PDF Downloads 150
18004 English Loanwords in the Egyptian Variety of Arabic: Morphological and Phonological Changes
Authors: Mohamed Yacoub
Abstract:
This paper investigates English loanwords in the Egyptian variety of Arabic and reaches three findings. For the first finding, data were collected from Egyptian movies and soap operas; over two hundred words have been borrowed from English (code-switching was not included). These words were then put into eleven different categories according to their use and part of speech. The second finding addresses the morphological and phonological changes that occurred to these words. Regarding the phonological change, eight categories of variation were found in both consonants and vowels, five for consonants and three for vowels, with examples given for each. Regarding the morphological change, five categories were found, including masculine, feminine, dual, broken, and non-pluralizable nouns. The last finding reports the answers to a four-question survey administered to forty-eight native speakers of Egyptian Arabic, which found that most participants did not recognize English borrowed words, thought they were originally Arabic, and could not give Arabic equivalents for the loanwords that they could recognize.
Keywords: sociolinguistics, loanwords, borrowing, morphology, phonology, variation, Egyptian dialect
Procedia PDF Downloads 390
18003 Dual-Polarized Multi-Antenna System for Massive MIMO Cellular Communications
Authors: Naser Ojaroudi Parchin, Haleh Jahanbakhsh Basherlou, Raed A. Abd-Alhameed, Peter S. Excell
Abstract:
In this paper, a multiple-input/multiple-output (MIMO) antenna design with polarization and radiation pattern diversity is presented for future smartphones. The configuration of the design consists of four double-fed circular-ring antenna elements located at different edges of the printed circuit board (PCB) with an FR-4 substrate and an overall dimension of 75×150 mm². The antenna elements are fed by 50-Ohm microstrip-lines and provide polarization and radiation pattern diversity due to the orthogonal placement of their feed lines. A good impedance bandwidth (S11 ≤ -10 dB) of 3.4-3.8 GHz has been obtained for the smartphone antenna array; for S11 ≤ -6 dB, this range widens to 3.25-3.95 GHz. More than 3 dB realized gain and 80% total efficiency are achieved for the single-element radiator. The presented design not only provides the required radiation coverage but also generates the polarization diversity characteristic.
Keywords: cellular communications, multiple-input/multiple-output systems, mobile-phone antenna, polarization diversity
Procedia PDF Downloads 144
18002 Unveiling Coaching Style of PE Teachers: A Convergent Parallel Approach
Authors: Arazan Jane V., Badiang, Ronesito Jr. R., Clavesillas Cristine Joy H., Belleza Saramie S.
Abstract:
This study examined coaching style among PE teachers in terms of autonomy-supportive style and controlling style. An autonomy-supportive style gives athletes opportunities to be independent and task-oriented, and acknowledges the feelings and perspective of each individual. A controlling coaching style, by contrast, is characterized by rises and falls over an athlete's training development; when this variance is identified, it might harm training. The respondents were a random sample of high school PE teachers of the Division of Davao del Norte, 78 in total: 70 high school PE teachers for the quantitative survey questionnaire and 8 PE teachers for the qualitative in-depth interviews (IDI). In the quantitative phase, a set of survey questionnaires was used to gather data from the participants; the tool was a researcher-made questionnaire based on the coaching styles of selected high school PE teachers of Davao del Norte. In the qualitative phase, an interview guide questionnaire was used, and focus group discussions were conducted to determine themes and patterns in participants' experiences and insights. From the findings of this study, the researchers conclude that the degree of coaching style among PE teachers from the Division of Davao del Norte is high and highly noticeable.
Keywords: supportive autonomy style, controlling style, lived experiences, exemplified
Procedia PDF Downloads 100
18001 Research on Straightening Process Model Based on Iteration and Self-Learning
Authors: Hong Lu, Xiong Xiao
Abstract:
Shaft parts are widely used in the machinery industry; however, bending deformation often occurs when such parts are heat treated, and they must be straightened to meet the straightness requirement. For the pressure straightening process, a good straightening stroke algorithm determines the precision and efficiency of the process. In this paper, the relationship between straightening load and deflection during the straightening process is analyzed, and a mathematical model of the process is established. Using this model, an iterative method is applied to solve for the straightening stroke. Compared to the traditional straightening stroke algorithm, the stroke calculated by this method is much more precise, because it can adapt to changes in material performance parameters. Considering that straightening is widely used in the mass production of shaft parts, a knowledge base is used to store data from the straightening process, and a straightening stroke algorithm based on empirical data is set up. The paper then establishes a straightening process control model that combines the iteration-based stroke method with the empirical-data-based stroke algorithm. Finally, an experiment is designed to verify the straightening process control model.
Keywords: straightness, straightening stroke, deflection, shaft parts
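The iterative stroke solution can be illustrated with a simple fixed-point scheme: press, estimate the elastic springback from a material model, and correct the stroke until the residual deflection vanishes. The linear springback relation below is a hypothetical stand-in, not the paper's load-deflection model:

```python
def required_stroke(deflection, springback, tol=1e-6, max_iter=100):
    """Iteratively find the straightening stroke s such that the residual
    deflection after elastic springback vanishes:
        residual(s) = deflection - (s - springback(s)) = 0.
    `springback` is a hypothetical material model mapping stroke to
    elastic recovery; a self-learning scheme would refit it from
    stored process data."""
    s = deflection  # initial guess: press exactly the measured deflection
    for _ in range(max_iter):
        residual = deflection - (s - springback(s))
        if abs(residual) < tol:
            break
        s += residual  # fixed-point update: add the uncorrected deflection
    return s

# Hypothetical linear springback: 30% of the stroke recovers elastically
stroke = required_stroke(deflection=2.0, springback=lambda s: 0.3 * s)
```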
Procedia PDF Downloads 330
18000 Design and Implementation of LabVIEW Based Relay Autotuning Controller for Level Setup
Authors: Manoj M. Sarode, Sharad P. Jadhav, Mukesh D. Patil, Pushparaj S. Suryawanshi
Abstract:
Even though the PID controller is widely used in industrial processes, tuning of PID parameters is not easy: it is time consuming and requires expertise. Another drawback of the PID controller is that process dynamics might change over time, for example due to variation of the process load or normal wear and tear. To compensate for process behavior change over time, expert users are required to recalibrate the PID gains. Implementation of model-based controllers usually needs a process model; identification of a process model is a time-consuming job with no guarantee of model accuracy, and if the identified model is not accurate, the performance of the controller may degrade. Model-based controllers are also quite expensive, and the whole implementation procedure is sometimes tedious. To eliminate such issues, an autotuning PID controller becomes a vital element. A software-based relay feedback autotuning controller proves to be an efficient, upgradable and maintenance-free controller, with which the PID parameters can be obtained within a very short span of time. This paper presents the real-time implementation of a LabVIEW based relay feedback autotuning PID controller. It was successfully developed and implemented to control the level of a laboratory setup, and its performance was analyzed for different setpoints and found satisfactory.
Keywords: autotuning, PID, liquid level control, recalibrate, labview, controller
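The relay feedback experiment yields the ultimate gain and period from the induced limit cycle, after which classical tuning rules give the PID settings. A sketch using the Astrom-Hagglund describing-function estimate and Ziegler-Nichols rules (the paper's LabVIEW implementation may use different rules; the measured values below are illustrative):

```python
import math

def relay_autotune(d, a, Pu):
    """Astrom-Hagglund relay feedback tuning.
    d: relay amplitude, a: measured amplitude of the output oscillation,
    Pu: measured oscillation period.
    The describing-function estimate of the ultimate gain is Ku = 4d/(pi*a);
    Ziegler-Nichols PID rules then give Kp, Ti, Td."""
    Ku = 4.0 * d / (math.pi * a)  # ultimate gain
    Kp = 0.6 * Ku
    Ti = Pu / 2.0
    Td = Pu / 8.0
    return Kp, Ti, Td

# Illustrative relay experiment results
Kp, Ti, Td = relay_autotune(d=1.0, a=0.5, Pu=4.0)
```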
Procedia PDF Downloads 397
17999 Thorium-Doped PbS Thin Films for Radiation Damage Studies
Authors: Michael Shandalov, Tzvi Templeman, Michael Schmidt, Itzhak Kelson, Eyal Yahel
Abstract:
We present a new method to produce a model system for the study of radiation damage in non-radioactive materials. The method is based on homogeneously incorporating 228Th ions in PbS thin films using a small volume chemical bath deposition (CBD) technique. The common way to alloy metals with radioactive elements is by melting pure elements, which requires considerable amounts of radioactive material, with safety consequences such as high sample activity. Controlled doping of the thin films with (very) small amounts (100-200 ppm) of radioactive elements such as thorium is expected to provide a unique path for studying radiation damage in materials due to decay processes, without the need for a sealed enclosure. As a first stage, we developed a CBD process for controlled doping of PbS thin films (~100 nm thick) with the effectively stable isotope 232Th (t1/2 ≈ 1.4×10¹⁰ years). Next, we developed a CBD process for controlled doping of PbS thin films with the active 228Th isotope. This was achieved by altering deposition parameters such as temperature, pH, reagent concentrations and time. The 228Th-doped films were characterized using X-ray diffraction, which indicated a single-phase material. Film morphology and thickness were determined using scanning electron microscopy (SEM). Energy dispersive spectroscopy (EDS) mapping in the analytical transmission electron microscope (A-TEM), X-ray photoelectron spectroscopy (XPS) depth profiles and autoradiography indicated that the Th ions were homogeneously distributed throughout the films, suggesting Pb substitution by Th ions in the crystal lattice. The activity of the PbS (228Th) films was investigated using alpha and gamma spectroscopy. The resulting films are suitable for isochronal annealing resistivity measurements and are currently under investigation.
This work shows promise as a model system for the analysis of dilute defect systems in semiconductor thin films.
Keywords: thin films, doping, radiation damage, chemical bath deposition
Procedia PDF Downloads 397
17998 High-Fidelity 1D Dynamic Model of a Hydraulic Servo Valve Using 3D Computational Fluid Dynamics and Electromagnetic Finite Element Analysis
Authors: D. Henninger, A. Zopey, T. Ihde, C. Mehring
Abstract:
The dynamic performance of a 4-way solenoid-operated hydraulic spool valve has been analyzed by means of a one-dimensional modeling approach capturing flow, magnetic and fluid forces, valve inertia forces, fluid compressibility, and damping. Increased model accuracy was achieved by analyzing the detailed three-dimensional electromagnetic behavior of the solenoids and the flow behavior through the spool valve body for a set of relevant operating conditions, thereby allowing the accurate mapping of flow and magnetic forces on the moving valve body, in lieu of representing the respective forces by lower-order models or simplistic textbook correlations. The resulting high-fidelity one-dimensional model provided the basis for specific and timely design modifications eliminating experimentally observed valve oscillations.
Keywords: dynamic performance model, high-fidelity model, 1D-3D decoupled analysis, solenoid-operated hydraulic servo valve, CFD and electromagnetic FEA
Procedia PDF Downloads 180
17997 A Study on the Waiting Time for the First Employment of Arts Graduates in Sri Lanka
Authors: Imali T. Jayamanne, K. P. Asoka Ramanayake
Abstract:
Transition from tertiary education to employment is one of the challenges that many fresh university graduates face after graduation. The transition period, or the waiting time to obtain the first employment, varies with socio-economic factors and the general characteristics of a graduate. Compared to graduates of other fields of study, Arts graduates in Sri Lanka have to wait a long time to find their first employment. The objective of this study is to identify the determinants of the transition from higher education to employment of these graduates using survival models. The study is based on a survey conducted in 2016 on a stratified random sample of Arts graduates from Sri Lankan universities who had graduated in 2012. Among the 469 responses, 36 (8%) waiting times were interval censored and 13 (3%) were right censored. Waiting time for the first employment varied between zero and 51 months. Initially, the log-rank and Gehan-Wilcoxon tests were performed to identify the significant factors. Gender, ethnicity, GCE Advanced Level English grade, civil status, university, class received, degree type, sector of first employment, type of first employment, and the educational qualifications required for the first employment were significant at 10%. The Cox proportional hazards model was fitted to model the waiting time for first employment with these significant factors. All factors except ethnicity and type of employment were significant at 5%. However, since the proportional hazards assumption was violated, a lognormal accelerated failure time (AFT) model was fitted instead. The same factors were significant in the AFT model as in the Cox proportional hazards model.
Keywords: AFT model, first employment, proportional hazard, survey design, waiting time
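As a minimal illustration of survival estimation with the right censoring described above, here is a Kaplan-Meier estimator over invented waiting times (the study's actual data are not reproduced; in practice one would fit the Cox or AFT models with a statistical package).

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the survivor function S(t).
    events[i] is 1 if the waiting time ended in employment,
    0 if the observation was right-censored."""
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        n = sum(1 for tt in times if tt >= t)                        # at risk
        d = sum(1 for tt, e in zip(times, events) if tt == t and e)  # events
        if d:
            surv *= 1.0 - d / n
            curve.append((t, surv))
    return curve

# Invented waiting times in months; 0 means employed at graduation.
times  = [0, 3, 3, 6, 12, 24, 24, 51]
events = [1, 1, 1, 1, 1, 0, 1, 0]       # two graduates right-censored
km = kaplan_meier(times, events)
```

Each curve entry gives the estimated probability of still being unemployed just after that month; censored observations contribute to the risk sets but cause no step.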
Procedia PDF Downloads 316
17996 An Outsourcing System Model for the Thai Electrical Appliances Industry
Authors: Sudawan Somjai
Abstract:
The purpose of this paper was to find an appropriate outsourcing system model for the Thai electrical appliances industry. The objective was to increase the competitive capability of the industry through an outsourcing system. The population for this study was the staff of 10 selected companies in the Thai electrical appliances industry located in Bangkok and the eastern part of Thailand. Data collection techniques included in-depth interviews, focus groups, and storytelling. Data were collected from 5 key informants from each company, for a total of 50 informants. The findings revealed that an outsourcing model would consist of important factors including the outsourcing system, labor flexibility, capability of the business process, manpower management efficiency, cost reduction, business risk elimination, core competency, and competitiveness. A number of suggestions are also offered in this paper.
Keywords: outsourcing system, model, Thailand, electrical appliances industry
Procedia PDF Downloads 593
17995 A Consideration on the Offset Frontal Impact Modeling Using Spring-Mass Model
Authors: Jaemoon Lim
Abstract:
To construct a lumped spring-mass model that includes the occupants for the offset frontal crash, the SISAME software and NHTSA test data were used. Data from the 56 kph, 40% offset frontal vehicle-to-deformable-barrier crash test of a MY2007 Mazda 6 4-door sedan were obtained from the NHTSA test database. The overall behaviors of the B-pillar and engine in the simulation models agreed very well with the test data. The trends of the accelerations at the driver and passenger head were similar, but there were large differences in peak values. These differences in peak values caused large errors in the HIC36 and 3 ms chest g's. To predict the dummy behaviors well, the spring-mass model for the offset frontal crash needs to be improved.
Keywords: chest g's, HIC36, lumped spring-mass model, offset frontal impact, SISAME
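The lumped spring-mass idea can be sketched in its simplest form: a single vehicle mass striking a rigid barrier through one crushable spring (actual SISAME-extracted models use many masses and springs, and every number below is invented). For this idealized system the crash pulse is a half sine and the peak deceleration is v0*sqrt(k/m), which gives the numerical result something to be checked against.

```python
def crash_pulse(m=1500.0, k=2.4e6, v0=15.6, dt=1e-5):
    """One mass, one crush spring, rigid barrier: m*x'' = -k*x while
    the front structure is loaded.  Returns peak deceleration in g."""
    x, v, peak_a = 0.0, v0, 0.0
    while True:
        a = -k * x / m                  # spring force decelerates the mass
        peak_a = max(peak_a, abs(a))
        v += a * dt                     # semi-implicit Euler step
        x += v * dt
        if x <= 0.0 and v <= 0.0:       # crush fully unloaded: rebound done
            return peak_a / 9.81

peak_g = crash_pulse()                  # analytic peak: v0*sqrt(k/m)/9.81
```

With the invented stiffness the model predicts a peak of roughly 64 g; tuning such spring parameters against measured B-pillar pulses is exactly the extraction step the abstract describes.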
Procedia PDF Downloads 462
17994 A Model for Optimizing Inventory Replenishment and Shelf Space Management in Retail Industries
Authors: Nermine A. Harraz, Aliaa Abouali
Abstract:
Retail stores put multiple items up for sale, while the space in the backroom and display areas constitutes a scarce resource. The availability, volume, and location of a product displayed in the showroom influence customer demand. Managing these operations individually results in a sub-optimal overall store profit; therefore, a non-linear integer programming (NLIP) model is developed to determine the inventory replenishment and shelf space allocation decisions that together maximize the retailer's profit under shelf space and backroom storage constraints, taking into consideration that the demand rate is positively dependent on the amount and location of items displayed in the showroom. The developed model is solved using LINGO® software. The NLIP model is implemented in a real-world case study in a large retail outlet offering a wide variety of products. The proposed model is validated and shows logical results when using experimental data collected from the market.
Keywords: retailing management, inventory replenishment, shelf space allocation, showroom, backroom
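A toy version of the coupling described above, with demand increasing in the number of display facings and a brute-force search standing in for the NLIP solver, can make the idea concrete. All numbers are invented for the sketch and location effects are omitted.

```python
from itertools import product

margins = [4.0, 2.5, 1.5]     # profit per unit sold, per item
base    = [10, 18, 25]        # base demand at one facing
lift    = [0.15, 0.10, 0.05]  # fractional demand lift per extra facing
CAP     = 10                  # total shelf facings available

def profit(facings):
    """Total margin when item i gets facings[i] display facings."""
    return sum(m * b * (1 + l * (f - 1))
               for m, b, l, f in zip(margins, base, lift, facings))

# Enumerate all feasible allocations (>= 1 facing each, capacity CAP).
best = max((f for f in product(range(1, CAP + 1), repeat=3)
            if sum(f) <= CAP), key=profit)
```

Because the demand lift is linear here, all spare facings go to the item with the largest marginal gain; the paper's NLIP handles the genuinely non-linear, location-dependent case that brute force cannot scale to.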
Procedia PDF Downloads 356
17993 Enhancing Cloud Computing with Security Trust Model
Authors: John Ayoade
Abstract:
Cloud computing is a model that enables the delivery of on-demand computing resources such as networks, servers, storage, applications, and services over the internet. Cloud computing is a relatively young concept that presents a good number of benefits for its users; however, it also raises some security challenges which may slow down its adoption. In this paper, we identify some of the security issues that can serve as barriers to realizing the full benefits that cloud computing can bring. One of the key security problems is security trust. A security trust model is proposed that can enhance the confidence that users need to fully trust the use of public and mobile cloud computing and maximize the potential benefits that they offer.
Keywords: cloud computing, trust, security, certificate authority, PKI
Procedia PDF Downloads 487
17992 Marginalized Two-Part Joint Models for Generalized Gamma Family of Distributions
Authors: Mohadeseh Shojaei Shahrokhabadi, Ding-Geng (Din) Chen
Abstract:
Positive continuous outcomes with a substantial number of zero values and incomplete longitudinal follow-up are quite common in medical cost data. To jointly model semi-continuous longitudinal cost data and survival data and to provide marginalized covariate effect estimates, a marginalized two-part joint model (MTJM) has been developed for outcome variables with lognormal distributions. In this paper, we propose MTJM models for outcome variables from the generalized gamma (GG) family of distributions. The GG distribution constitutes a general family that includes nearly all of the most frequently used distributions, such as the gamma, exponential, Weibull, and lognormal. In the proposed MTJM-GG model, the conditional mean from a conventional two-part model with a three-parameter GG distribution is parameterized to provide a marginal interpretation for the regression coefficients. In addition, MTJM-gamma and MTJM-Weibull are developed as special cases of MTJM-GG. To illustrate the applicability of the MTJM-GG, we applied the model to a set of real electronic health record data recently collected in Iran, and we provide SAS code for the application. The simulation results showed that when the outcome distribution is unknown or misspecified, which is usually the case in real data sets, the MTJM-GG consistently outperforms the other models. The GG family of distributions facilitates estimating a model with improved fit over the MTJM-gamma, standard Weibull, or lognormal alternatives.
Keywords: marginalized two-part model, zero-inflated, right-skewed, semi-continuous, generalized gamma
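The special-case structure of the GG family is easy to verify numerically. Using Stacy's three-parameter density (one common parameterization; the paper does not specify its exact form), setting the two shape parameters equal recovers the Weibull density, and setting the power parameter to one recovers the gamma density; the lognormal arises only as a limiting case and is not shown.

```python
import math

def gg_pdf(t, a, d, p):
    """Stacy's generalized gamma density: scale a, shape d, power p."""
    return (p / a**d) * t**(d - 1) * math.exp(-(t / a)**p) / math.gamma(d / p)

def weibull_pdf(t, shape, scale):
    return (shape / scale) * (t / scale)**(shape - 1) \
        * math.exp(-(t / scale)**shape)

def gamma_pdf(t, shape, scale):
    return t**(shape - 1) * math.exp(-t / scale) \
        / (math.gamma(shape) * scale**shape)

# d == p  reduces GG to Weibull;  p == 1  reduces GG to gamma.
```

This nesting is what lets a fitted GG model fall back to the simpler distributions when the data support them, which is the robustness property the simulations exploit.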
Procedia PDF Downloads 184
17991 A New Model to Perform Preliminary Evaluations of Complex Systems for the Production of Energy for Buildings: Case Study
Authors: Roberto de Lieto Vollaro, Emanuele de Lieto Vollaro, Gianluca Coltrinari
Abstract:
The building sector is responsible, in many industrialized countries, for about 40% of total energy requirements, so it seems necessary to devote some effort to this area in order to achieve a significant reduction of energy consumption and greenhouse gas emissions. The paper presents a study aiming to provide a design methodology able to identify the best configuration of the building/plant system from a technical, economic, and environmental point of view. Normally, the classical approach involves analyzing a building's energy loads under steady-state conditions and then selecting measures aimed at improving the energy performance, based on the previous experience of the architects and engineers in the design team. Instead, the proposed approach uses a sequence of two well-known, scientifically validated calculation methods (TRNSYS and RETScreen) that allow quite a detailed feasibility analysis. To assess the validity of the calculation model, an existing historical building in central Italy, which will be the object of restoration and preservation-oriented redevelopment, was selected as a case study. The building consists of a basement and three floors, with a total floor area of about 3,000 square meters. The first step was the determination of the heating and cooling energy loads of the building in a dynamic regime by means of TRNSYS, which makes it possible to simulate the real energy needs of the building as a function of its use. Traditional methodologies, based as they are on steady-state conditions, cannot faithfully reproduce the effects of varying climatic conditions or of the inertial properties of the structure. With TRNSYS it is possible to obtain quite accurate and reliable results that allow the identification of effective building-HVAC system combinations.
The second step consisted of using the output data obtained with TRNSYS as input to the calculation model RETScreen, which makes it possible to compare different system configurations from the energy, environmental, and financial points of view, with an analysis of investment, operation, and maintenance costs, thus allowing the economic benefit of possible interventions to be determined. The classical methodology often leads to the choice of conventional plant systems, while RETScreen provides a financial-economic assessment for innovative, low-environmental-impact energy systems. Computational analysis can help in the design phase, particularly in the case of complex structures with centralized plant systems, by comparing the data returned by the RETScreen calculation model for different design options. For example, the analysis performed on the building taken as a case study found that the most suitable plant solution, taking into account technical, economic, and environmental aspects, is one based on a CCHP system (combined cooling, heating, and power) using an internal combustion engine.
Keywords: energy, system, building, cooling, electrical
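The financial side of a RETScreen-style screening ultimately reduces to comparing discounted cash flows of the candidate configurations. The sketch below uses invented numbers for a CCHP option (not the study's actual figures) to show the two basic indicators, simple payback and net present value.

```python
def npv(rate, cashflows):
    """Net present value of yearly cash flows; cashflows[0] is year 0."""
    return sum(cf / (1 + rate)**year for year, cf in enumerate(cashflows))

def simple_payback(capex, annual_saving):
    """Years of savings needed to recover the extra capital cost."""
    return capex / annual_saving

# Invented screening numbers: 250 k of extra capital cost for the CCHP
# option, 32 k/yr net energy-cost saving, 20-year life, 5% discount rate.
capex, saving, life, rate = 250_000.0, 32_000.0, 20, 0.05
flows = [-capex] + [saving] * life
payback = simple_payback(capex, saving)
value = npv(rate, flows)
```

A positive NPV under the chosen discount rate is the signal that the innovative option beats doing nothing; RETScreen layers emissions accounting and sensitivity analysis on top of this core calculation.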
Procedia PDF Downloads 576
17990 Different Sampling Schemes for Semi-Parametric Frailty Model
Authors: Nursel Koyuncu, Nihal Ata Tutkun
Abstract:
The frailty model is a survival model that takes into account unobserved heterogeneity when exploring the relationship between the survival of an individual and several covariates. In recent years, proposed survival models have become more complex, and this feature causes convergence problems, especially in large data sets. Therefore, the selection of a sample from these big data sets is very important for the estimation of parameters. In the sampling literature, some authors have defined new sampling schemes to estimate the parameters correctly. To this end, we examine the effect of sampling design on the semi-parametric frailty model. We conducted a simulation study in R to estimate the parameters of the semi-parametric frailty model for different sample sizes and censoring rates under classical simple random sampling and ranked set sampling schemes. In the simulation study, we used as the population a data set recording 17,260 male civil servants aged 40-64 years with complete 10-year follow-up. Time to death from coronary heart disease is treated as the survival time, and age and systolic blood pressure are used as covariates. We selected 1,000 samples from the population using the different sampling schemes and estimated the parameters. From the simulation study, we concluded that the ranked set sampling design performs better than simple random sampling in each scenario.
Keywords: frailty model, ranked set sampling, efficiency, simple random sampling
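The efficiency gain of ranked set sampling over simple random sampling can be shown with a small stand-alone simulation. This is a stand-in for the paper's frailty-model study: here only the mean of a synthetic population is estimated, perfect ranking is assumed, and all sizes are invented.

```python
import random
import statistics

def srs_mean(pop, n, rng):
    """Mean of a simple random sample of size n."""
    return statistics.fmean(rng.sample(pop, n))

def rss_mean(pop, set_size, cycles, rng):
    """Ranked set sampling with perfect ranking: for each rank r,
    draw set_size units and keep the r-th smallest; repeat over
    cycles, giving a final sample of size set_size * cycles."""
    picks = []
    for _ in range(cycles):
        for r in range(set_size):
            picks.append(sorted(rng.sample(pop, set_size))[r])
    return statistics.fmean(picks)

rng = random.Random(7)
pop = [rng.gauss(50.0, 10.0) for _ in range(10_000)]
reps, m, k = 2000, 3, 3                  # both designs use n = m * k = 9
srs_est = [srs_mean(pop, m * k, rng) for _ in range(reps)]
rss_est = [rss_mean(pop, m, k, rng) for _ in range(reps)]
```

Both estimators are unbiased, but the spread of the RSS estimates across replications is markedly smaller, which is the efficiency property the paper exploits for frailty-model estimation.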
Procedia PDF Downloads 214
17989 Design of a Tool for Generating Test Cases from BPMN
Authors: Prat Yotyawilai, Taratip Suwannasart
Abstract:
Business Process Model and Notation (BPMN) is increasingly important in business process and functional modeling, and is an OMG standard that has become popular in various organizations and in education. Research on model-based software testing is prominent; however, although most studies use UML models in software testing, few use the BPMN model to create test cases. Therefore, this research proposes the design of a tool for generating test cases from BPMN. The model is analyzed and the details of its various components are extracted before creating a flow graph. Both the component details and the flow graph are then used in generating test cases.
Keywords: software testing, test case, BPMN, flow graph
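The final step described, turning the flow graph into test cases, can be sketched as path enumeration by depth-first search. The graph below is an invented example (one exclusive gateway with two branches merging before the end event), not taken from the paper's tool.

```python
def all_paths(graph, start, end, path=None):
    """Enumerate all start-to-end paths in a flow graph by DFS,
    skipping already-visited nodes so loops terminate."""
    path = (path or []) + [start]
    if start == end:
        return [path]
    paths = []
    for nxt in graph.get(start, []):
        if nxt not in path:
            paths.extend(all_paths(graph, nxt, end, path))
    return paths

flow = {
    "start":   ["check"],
    "check":   ["approve", "reject"],   # exclusive gateway
    "approve": ["notify"],
    "reject":  ["notify"],              # branches merge
    "notify":  ["end"],
}
cases = all_paths(flow, "start", "end")
```

Each returned path corresponds to one abstract test case; a real tool would additionally attach the extracted component details (conditions, data) to each step.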
Procedia PDF Downloads 560
17988 An Approach for Modeling CMOS Gates
Authors: Spyridon Nikolaidis
Abstract:
A modeling approach for CMOS gates is presented based on the use of an equivalent inverter. A new model for the inverter has been developed using a simplified transistor current model which incorporates the nanoscale effects of the planar technology. Parametric expressions for the output voltage are provided, as well as the values of the output and supply current, to be compatible with the CCS technology. The model is parametric with respect to the input signal slew, output load, transistor widths, supply voltage, temperature, and process. The transistor widths of the equivalent inverter are determined by HSPICE simulations, and parametric expressions are developed for them using a fitting procedure. Results for the NAND gate show that the proposed approach offers sufficient accuracy, with an average error in propagation delay of about 5%.
Keywords: CMOS gate modeling, inverter modeling, transistor current model, timing model
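The fitting procedure mentioned can be sketched as ordinary least squares on an assumed linear delay form t_d = a + b*slew + c*C_load (the paper's actual expressions are richer). The sample points below are invented and generated from known coefficients so the fit can be checked exactly.

```python
def lstsq_3(rows, y):
    """Solve the 3x3 normal equations (A^T A) x = A^T y by Gaussian
    elimination with partial pivoting; rows are (1, slew, load)."""
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    aty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(3)]
    m = [row + [b] for row, b in zip(ata, aty)]     # augmented matrix
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, 3):
            f = m[r][col] / m[col][col]
            for cc in range(col, 4):
                m[r][cc] -= f * m[col][cc]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):                             # back substitution
        x[r] = (m[r][3] - sum(m[r][cc] * x[cc]
                              for cc in range(r + 1, 3))) / m[r][r]
    return x

slews = [0.05, 0.1, 0.2, 0.4]            # input slew, ns (invented grid)
loads = [1.0, 2.0, 5.0, 10.0]            # output load, fF (invented grid)
pts = [(1.0, s, cl) for s in slews for cl in loads]
true = (12.0, 40.0, 3.5)                 # ps; known coefficients for the check
delays = [true[0] + true[1] * s + true[2] * cl for _, s, cl in pts]
a, b, c = lstsq_3(pts, delays)           # recover the coefficients
```

In the paper's flow, the delays would come from HSPICE runs over a slew/load grid rather than from a known formula, and the fitted expressions then feed the CCS-compatible gate model.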
Procedia PDF Downloads 426