Search results for: code blue response time
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 22834

22324 Comparison of FNTD and OSLD Detectors' Responses to Light Ion Beams Using Monte Carlo Simulations and Experimental Data

Authors: M. R. Akbari, H. Yousefnia, A. Ghasemi

Abstract:

Al2O3:C,Mg fluorescent nuclear track detectors (FNTDs) and Al2O3:C optically stimulated luminescence detectors (OSLDs) are becoming two of the most widely applied detectors in ion dosimetry. The response of these detectors to hadron beams is therefore of great interest in radiation therapy (RT) using ion beams. In this study, the responses of these detectors to proton and helium-4 ion beams were compared using Monte Carlo simulations. The calculated data for proton beams were compared with Markus ionization chamber (IC) measurements (in a water phantom) from the M.D. Anderson proton therapy center. Monte Carlo simulations were performed with the FLUKA code (version 2011.2-17). The detectors were modeled in cylindrical shape at various depths of the water phantom, without shading each other, to obtain the relative depth dose in the phantom. Mono-energetic parallel ion beams at different incident energies (100 MeV/n to 250 MeV/n) impinged perpendicularly on the phantom surface. For proton beams, the results showed that the simulated detectors over-respond relative to IC measurements in the water phantom. In all cases, there was good agreement between the simulated ion ranges in water and the calculated and experimental results reported in the literature. For protons, the maximum peak-to-entrance dose ratio in the simulated water phantom was 4.3, compared with about 3 obtained from IC measurements. For He-4 ion beams, the maximum peak-to-entrance ratio calculated by both detectors was less than 3.6 at all energies. Generally, it can be said that FLUKA is a good tool for calculating the responses of Al2O3:C,Mg FNTD and Al2O3:C OSLD detectors to therapeutic proton and He-4 ion beams. It can also calculate proton and He-4 ion ranges with reasonable accuracy.
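To make the quoted figure concrete: the peak-to-entrance dose ratio is simply the Bragg-peak dose divided by the entrance (surface) dose along the relative depth-dose curve. A minimal sketch, using invented sample values rather than the authors' FLUKA output:

```python
# Illustrative only: computing the peak-to-entrance dose ratio from a
# relative depth-dose curve. The values below are made-up points along
# a Bragg-like curve, not the paper's simulated data.
depths_cm = [0, 2, 4, 6, 7.0, 7.2, 7.4, 7.6, 8.0]
relative_dose = [1.0, 1.05, 1.15, 1.4, 2.5, 4.3, 3.0, 0.5, 0.05]

entrance_dose = relative_dose[0]
peak_dose = max(relative_dose)
peak_depth = depths_cm[relative_dose.index(peak_dose)]

ratio = peak_dose / entrance_dose
print(f"peak-to-entrance ratio: {ratio:.1f} at depth {peak_depth} cm")
```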

Keywords: comparison, FNTD and OSLD detectors response, light ion beams, Monte Carlo simulations

Procedia PDF Downloads 323
22323 A Semidefinite Model to Quantify Dynamic Forces in the Powertrain of Torque Regulated Bascule Bridge Machineries

Authors: Kodo Sektani, Apostolos Tsouvalas, Andrei Metrikine

Abstract:

The reassessment of existing movable bridges in the Netherlands has created the need for acceptance/rejection criteria to assess whether the machineries meet certain design demands. However, the existing design code defines a different limit state design, meant for new machineries, which is based on a simple linear spring-mass model. Observations show that existing bridges do not conform to the model predictions. In fact, movable bridges are nonlinear systems consisting of mechanical components such as gears, electric motors, and brakes. Moreover, each movable bridge is characterized by a unique set of parameters, yet in the existing code various variables that describe the physical characteristics of the bridge are neglected or replaced by partial factors. For instance, the damping ratio ζ, which differs between drawbridges and bascule bridges, is taken as a constant for all bridge types. In this paper, a model is developed that overcomes some of the limitations of existing modelling approaches to capture the dynamics of the powertrain of a class of bridge machineries. First, a semidefinite dynamic model is proposed, which accounts for stiffness, damping, and additional properties of the physical system that are neglected by the code, such as nonlinear braking torques. The model gives an upper bound of the peak forces/torques occurring in the powertrain during emergency braking. Second, a discrete nonlinear dynamic model is discussed, with realistic motor torque characteristics during normal operation. This model succeeds in accurately predicting the full time history of the stress state over the opening and closing cycle, for fatigue purposes.
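A semidefinite powertrain model of the kind described is, in its simplest form, a free-free two-inertia chain (motor side and bridge side joined by a torsionally flexible shaft), which has one zero rigid-body eigenfrequency plus an elastic mode. The following sketch uses invented parameter values, and the factor-of-2 dynamic amplification is the textbook undamped step-response bound, not the paper's model:

```python
# Hypothetical two-inertia semidefinite sketch of a bridge powertrain:
#   J1*th1'' + c*(th1'-th2') + k*(th1-th2) = -T_brake
#   J2*th2'' + c*(th2'-th1') + k*(th2-th1) = 0
# All numbers are invented for illustration.
import math

J1, J2 = 50.0, 5000.0   # motor-side / bridge-side inertias [kg m^2]
k = 2.0e6               # shaft torsional stiffness [N m/rad]

# Elastic natural frequency of the free-free two-inertia chain
omega = math.sqrt(k * (J1 + J2) / (J1 * J2))
print(f"elastic mode: {omega:.1f} rad/s")

# Upper-bound peak shaft torque under a step braking torque T_brake:
# quasi-static share T_brake*J2/(J1+J2), amplified by the undamped
# step-response factor of 2.
T_brake = 1.0e4         # [N m]
T_peak = 2.0 * T_brake * J2 / (J1 + J2)
print(f"upper-bound peak shaft torque: {T_peak:.0f} N m")
```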

Keywords: dynamics of movable bridges, bridge machinery, powertrains, torque measurements

Procedia PDF Downloads 133
22322 Simulation of a Cost Model Response Requests for Replication in Data Grid Environment

Authors: Kaddi Mohammed, A. Benatiallah, D. Benatiallah

Abstract:

Data grid is a technology whose emergence has brought new challenges, such as the heterogeneity and availability of geographically distributed resources, fast data access, minimizing latency, and fault tolerance. Researchers interested in this technology address problems arising in such systems, including task scheduling, load balancing, and replication. The latter is an effective solution for achieving good performance in terms of data access, use of grid resources, and better data availability at lower cost. In a system with replication, a coherence protocol is used to impose some degree of synchronization between the various copies and some order on updates. In this project, we present an approach for placing replicas that minimizes the response cost of read and write requests, and we implement our model in a simulation environment. The placement techniques are based on a cost model which depends on several factors, such as bandwidth, data size, and storage nodes.
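A cost model of the general shape described (response cost driven by data size and link bandwidth, each request served by its cheapest replica) can be sketched as follows; the network, node names, and weighting are invented for illustration and are not the paper's model:

```python
# Hypothetical replica-placement cost model: transfer time over the
# link from the requesting node to the cheapest replica holder.
def transfer_cost(size_mb, bandwidth_mbps):
    """Time (s) to move the data over the link."""
    return size_mb * 8 / bandwidth_mbps

def placement_cost(replica_nodes, requests, bandwidth):
    """Total response cost: each request served by its cheapest replica."""
    total = 0.0
    for node, size_mb in requests:
        total += min(transfer_cost(size_mb, bandwidth[(node, r)])
                     for r in replica_nodes)
    return total

# Invented grid: bandwidth (Mbps) between requesting and storage nodes
bandwidth = {("A", "B"): 100, ("A", "C"): 10, ("D", "B"): 50, ("D", "C"): 200}
requests = [("A", 500), ("D", 500)]   # (requesting node, file size in MB)

cost_b = placement_cost({"B"}, requests, bandwidth)        # replica at B only
cost_bc = placement_cost({"B", "C"}, requests, bandwidth)  # replicas at B and C
print(cost_b, cost_bc)
```

Comparing candidate placements this way is how a simulator can pick the replica set with the lowest aggregate response cost.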

Keywords: response time, query, consistency, bandwidth, storage capacity, CERN

Procedia PDF Downloads 252
22321 REST API Based System-Level Test Automation for Mobile Applications

Authors: Jisoo Song

Abstract:

Today’s mobile applications communicate with servers more and more in order to access external services or information. Moreover, server-side code changes are more frequent than client-side code changes in a mobile application, and these frequent changes lead to an increase in testing cost. To reduce this cost, UI-based test automation, a common technique in system-level testing, can be one solution. However, it can be unsuitable for mobile applications: when you automate tests based on UI elements, there are limitations such as the overhead of script maintenance and the difficulty of finding invisible defects that UI elements cannot represent. To overcome these limitations, we present a new automation technique based on REST APIs. You can automate system-level tests through test scripts that call a series of REST APIs in a user’s action sequence. This technique does not require testers to know the internal implementation details, only the inputs and expected outputs of the REST APIs. You can easily modify test cases by modifying REST API input values, and you can also find problems that might not be evident at the UI level by validating output values. For example, when an application receives price information from a payment server and the user cannot see it at the UI level, REST API based scripts can check whether the price information is correct. More than 10 mobile applications at our company are tested automatically with REST API scripts whenever application source code, mostly server source code, is built. We find defects right away by setting a script as a build job in the CI server; the build job starts when application code builds are completed. This presentation will also include field cases from our company.
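An API-level check of the kind described (validating a price field the UI never displays) might look like the sketch below. The endpoint, field names, and values are hypothetical, not SK Planet's actual API; a canned response body stands in for the live HTTP call so the sketch runs offline:

```python
# Illustrative REST-API-level test: validate output values that the UI
# may never display. Endpoint and payload fields are invented.
import json

def check_price_response(body: bytes) -> None:
    payload = json.loads(body)
    assert payload["status"] == "OK", payload
    assert payload["price"] == 4900, f"wrong price: {payload['price']}"
    assert payload["currency"] == "KRW"

# In a real script the body would come from e.g.
#   urllib.request.urlopen("https://api.example.com/v1/items/42").read()
# Here a canned response keeps the sketch self-contained.
fake_response = b'{"status": "OK", "price": 4900, "currency": "KRW"}'
check_price_response(fake_response)
print("payment API test passed")
```

Run as a CI build job, a failing assertion here surfaces a server-side defect immediately after the build, without any UI interaction.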

Keywords: case studies at SK Planet, introduction of REST API based test automation, limitations of UI based test automation

Procedia PDF Downloads 426
22320 Ecological Impacts of Cage Farming: A Case Study of Lake Victoria, Kenya

Authors: Mercy Chepkirui, Reuben Omondi, Paul Orina, Albert Getabu, Lewis Sitoki, Jonathan Munguti

Abstract:

Globally, the decline in capture fisheries, together with the growing population and increasing awareness of the nutritional benefits of white meat, has led to the development of aquaculture. Aquaculture is anticipated to meet the increasing demand for food from a human population likely to grow further by 2050; statistics show that more than 50% of the global future fish diet will come from aquaculture. Aquaculture began to be commercialized some decades ago, which is attributed to technological advancement from traditional to modern culture systems, including cage farming. Cage farming technology has grown rapidly since its inception in Lake Victoria, Kenya. Currently, over 6,000 cages have been set up in Kenyan waters, and this supports the Kenyan government’s strategy to eliminate food insecurity and malnutrition, create employment, and promote a Blue Economy. However, being an open farming enterprise, cage culture is likely to release a large bulk of waste, altering the ecosystem integrity of the lake through increased chlorophyll-a pigments, alteration of the plankton and macroinvertebrate communities, fish genetic pollution, and transmission of fish diseases and pathogens. Cage farming further increases nutrient loads, leading to the production of harmful algal blooms, thus negatively affecting aquatic and human life. Despite this ecological transformation, cage farming provides a platform for achieving the Sustainable Development Goals of 2030, especially food security and nutrition. Therefore, there is a need for Integrated Multitrophic Aquaculture as part of the Blue Transformation for ecosystem monitoring.

Keywords: aquaculture, ecosystem, blue economy, food security

Procedia PDF Downloads 58
22319 Scalable UI Test Automation for Large-Scale Web Applications

Authors: Kuniaki Kudo, Raviraj Solanki, Kaushal Patel, Yash Virani

Abstract:

This research concerns optimizing UI test automation for large-scale web applications. The test target is the HHAexchange homecare management web application, which seamlessly connects providers, state Medicaid programs, managed care organizations (MCOs), and caregivers through one platform with large-scale functionality. This study focuses on user interface automation testing for that web application. The quality assurance team must execute many manual user interface test cases during the development process to confirm that there are no regression bugs. The team automated 346 test cases, and the UI automation test execution time was over 17 hours. The business requirement was to reduce the execution time in order to release high-quality products quickly, so the quality assurance automation team modernized the test automation framework to optimize the execution time. The base of the web UI automation test environment is Selenium, and the test code is written in Python; adopting a compiled language for test code leads to an inefficient flow when introducing scalability into a traditional test automation environment, so a scripting language was adopted. The scalability is implemented mainly with AWS serverless technology, the Elastic Container Service. Scalability here means the ability to automatically set up computers for test automation and to increase or decrease the number of computers running those tests, so that test cases can run in parallel and test execution time is dramatically decreased. Introducing scalable test automation is also about more than reducing test execution time: some challenging bugs, such as race conditions, may be detected because test cases can be executed at the same time.
If API and unit tests are implemented, test strategies can be adopted more efficiently alongside this scalable testing. However, in web applications, as a practical matter, API and unit testing cannot cover 100% of functional testing since they do not reach front-end code. This study applied a scalable UI automation testing strategy to the large-scale homecare management system and confirmed both the optimization of test case execution time and the detection of a challenging bug. This study first describes the detailed architecture of the scalable test automation environment, then reports the actual reduction in execution time and an example of challenging issue detection.
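The core of the scalability argument is that independent test cases run concurrently, so wall-clock time approaches the longest single case rather than the sum. A minimal local sketch of that idea follows; each "test" here is a stub sleep and the workers are threads, whereas in the setup described above the workers would be ECS containers:

```python
# Minimal parallel-execution sketch: 8 stub test cases, each 0.2 s,
# finish in roughly 0.2 s total instead of 1.6 s when run in parallel.
import time
from concurrent.futures import ThreadPoolExecutor

def run_test_case(name: str) -> str:
    time.sleep(0.2)           # stand-in for a full Selenium scenario
    return f"{name}: pass"

cases = [f"tc_{i:03d}" for i in range(8)]

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(run_test_case, cases))
elapsed = time.perf_counter() - start

print(f"{len(results)} cases in {elapsed:.2f}s")
```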

Keywords: aws, elastic container service, scalability, serverless, ui automation test

Procedia PDF Downloads 73
22318 Performance of Buildings with Base-Isolation System under Geometric Irregularities

Authors: Firoz Alam Faroque, Ankur Neog

Abstract:

Earthquakes cause significant loss of life and severe damage to infrastructure. A base isolator is one of the most suitable solutions to make a building earthquake resistant. Base isolation consists of installing an isolator along with steel plates, combined with pads of a strong, flexible material such as rubber. In our study, we have used lead rubber bearings (LRBs). The basic idea of seismic isolation is the reduction of earthquake-induced inertia forces by shifting the fundamental period of the structure out of the dangerous resonance range, and the concentration of the deformation and energy dissipation demands at the isolation and energy dissipation systems, which are designed for this purpose. In this paper, RC frame buildings have been modeled and analyzed by the response spectrum method using ETABS software. The LRB used in the model is designed as per the Uniform Building Code (UBC) 97. It is found that the time period of the base-isolated structures is higher than that of the fixed-base structure, and that the base shear is significantly reduced in base-isolated buildings. It has also been found that buildings with vertical irregularities perform better than buildings with plan irregularities when base isolators are used.
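The period shift reported above follows directly from T = 2π√(m/k): the soft isolation layer reduces the effective lateral stiffness and lengthens the period. A back-of-the-envelope sketch with invented mass and stiffness values (not the ETABS model's):

```python
# Illustrative period-shift calculation for a base-isolated structure.
# Masses and stiffnesses are invented, for illustration only.
import math

def period(mass_kg, stiffness_n_per_m):
    """Fundamental period of an equivalent SDOF system: T = 2*pi*sqrt(m/k)."""
    return 2 * math.pi * math.sqrt(mass_kg / stiffness_n_per_m)

m = 5.0e5                 # effective modal mass [kg]
k_fixed = 2.0e8           # lateral stiffness, fixed base [N/m]
k_isolated = 8.0e6        # much softer LRB isolation layer [N/m]

T_fixed = period(m, k_fixed)
T_iso = period(m, k_isolated)
print(f"fixed base: T = {T_fixed:.2f} s, isolated: T = {T_iso:.2f} s")
```

With the longer period, the structure moves down the acceleration response spectrum, which is why the base shear drops.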

Keywords: base isolation, base shear, irregularities in buildings, lead rubber bearing (LRB)

Procedia PDF Downloads 309
22317 Pallet Tracking and Cost Optimization of the Flow of Goods in Logistics Operations by Serial Shipping Container Code

Authors: Dominika Crnjac Milic, Martina Martinovic, Vladimir Simovic

Abstract:

The case study method in this paper shows the implementation of Information Technology (IT) and the Serial Shipping Container Code (SSCC) in a Croatian company that performs logistics operations and provides logistics services in the cold chain segment. This company is aware of the sensitivity of the goods entrusted to it by the users of its services, as well as of the importance of speed and accuracy in providing logistics services. To that end, it has implemented and used the latest IT to ensure the highest standard of logistics services for its customers. Looking for efficiency and optimization of supply chain management, while maintaining a high level of quality of the products that are sold, today's users of outsourced logistics services are open to the implementation of new IT products that ultimately deliver savings. By analysing the positive results and the difficulties that arise when using this technology, we aim to provide an insight into the potential of this approach for logistics service providers.
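For context on the code itself: the SSCC is an 18-digit GS1 key (extension digit, GS1 company prefix, serial reference, check digit) whose final digit is the standard GS1 mod-10 check digit. A sketch of that check-digit computation, with a made-up 17-digit data part:

```python
# GS1 mod-10 check digit for an SSCC: weights 3,1,3,1,... applied from
# the rightmost data digit. The example data string is invented.
def sscc_check_digit(data17: str) -> int:
    assert len(data17) == 17 and data17.isdigit()
    total = sum((3 if i % 2 == 0 else 1) * int(d)
                for i, d in enumerate(reversed(data17)))
    return (10 - total % 10) % 10

data = "00614141123456789"   # extension digit + company prefix + serial
sscc = data + str(sscc_check_digit(data))
print(sscc)                  # full 18-digit SSCC
```

Scanning this code on a pallet label lets the warehouse system verify the digit string before looking up the logistics unit, which is part of the accuracy benefit discussed above.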

Keywords: logistics operations, serial shipping container code, information technology, cost optimization

Procedia PDF Downloads 347
22316 Dynamic Response of Nano Spherical Shell Subjected to Thermo-Mechanical Shock Using Nonlocal Elasticity Theory

Authors: J. Ranjbarn, A. Alibeigloo

Abstract:

In this paper, we present an analytical method for the analysis of a nano-scale spherical shell subjected to thermo-mechanical shocks, based on nonlocal elasticity theory. The thermo-mechanical properties of the nanosphere are assumed to be temperature dependent. The governing partial differential equation of motion is solved analytically by using the Laplace transform for the time domain and power series for the spatial domain. The results in the Laplace domain are transferred to the time domain by employing the fast inverse Laplace transform (FILT) method. The accuracy of the present approach is assessed by comparing the numerical results with results of published work in the literature. Furthermore, the effects of the nonlocal parameter and wall thickness on the dynamic characteristics of the nanosphere are studied.
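Nonlocal elasticity analyses of this kind are commonly based on Eringen's differential constitutive relation; assuming that standard form is the one used here, it reads:

```latex
% Eringen's differential nonlocal model: the nonlocal stress sigma_ij
% satisfies (e_0 a is the nonlocal parameter, C_ijkl the elastic moduli)
\left[\, 1 - (e_0 a)^2 \nabla^2 \,\right]\sigma_{ij} = C_{ijkl}\,\varepsilon_{kl}
```

For e_0 a → 0 this reduces to classical local elasticity, which is why the influence of the nonlocal parameter on the dynamic response can be isolated as described above.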

Keywords: nano-scale spherical shell, nonlocal elasticity theory, thermomechanical shock, dynamic response

Procedia PDF Downloads 356
22315 A Study on Removal of SO3 in Flue Gas Generated from Power Plant

Authors: E. Y. Jo, S. M. Park, I. S. Yeo, K. K. Kim, S. J. Park, Y. K. Kim, Y. D. Kim, C. G. Park

Abstract:

SO3 is created in small quantities during the combustion of fuel that contains sulfur, with the quantity produced being a function of the boiler design, fuel sulfur content, excess air level, and the presence of oxidizing agents. Typically, about 1% of the fuel sulfur will be oxidized to SO3, but this can range from 0.5% to 1.5% depending on various factors; combustion of fuels that contain oxidizing agents, such as certain types of fuel oil or petroleum coke, can result in even higher levels of oxidation. SO3 levels in the flue gas emitted by combustion are very high, which causes machinery corrosion and a visible blue plume. Because of this, power plants firing petroleum residues need to install an SO3 removal system. In this study, an SO3 removal system using salt solutions was developed, and several salt solutions were tested to obtain the optimal solution for the SO3 removal system. Response surface methodology was used to optimize operation parameters such as the gas-liquid ratio and the concentration of salts.

Keywords: flue gas desulfurization, petroleum coke, sulfur trioxide, SO3 removal

Procedia PDF Downloads 496
22314 Model Driven Architecture Methodologies: A Review

Authors: Arslan Murtaza

Abstract:

Model Driven Architecture (MDA) is a technique presented by the OMG (Object Management Group) for software development, in which different models are proposed and then converted into code. The main idea is to specify the task using a PIM (Platform Independent Model), transform it into a PSM (Platform Specific Model), and then convert it into code. This review paper describes some challenges and issues that are faced in MDA, the types and transformations of models (e.g., CIM, PIM, and PSM), and an evaluation of MDA-based methodologies.

Keywords: OMG, model driven architecture (MDA), computation independent model (CIM), platform independent model (PIM), platform specific model (PSM), MDA-based methodologies

Procedia PDF Downloads 437
22313 Supplemental Visco-Friction Damping for Dynamical Structural Systems

Authors: Sharad Singh, Ajay Kumar Sinha

Abstract:

Coupled dampers, such as viscoelastic-frictional dampers, are a newer technique for supplemental damping. In this paper, innovative visco-frictional damping models are presented and investigated. The paper attempts to couple frictional and fluid viscous dampers into a single unit of supplemental dampers. The visco-frictional damping models are developed by series and parallel coupling of frictional and fluid viscous dampers using the Maxwell and Kelvin-Voigt models. Time-history analysis has been performed by numerical simulation on an SDOF system with varying fundamental periods, subject to a set of 12 ground motions. The simulation was performed using the direct time integration method, with MATLAB as the programming tool. The response behavior has been analyzed for varying time periods and added damping. This paper compares the response reduction behavior of the two modes of coupling and highlights the performance efficiency of the suggested damping models. It also presents a mathematical modeling approach to visco-frictional dampers and suggests the suitable mode of coupling between the two sub-units.
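To illustrate the kind of SDOF direct-time-integration run described above, the sketch below integrates a mass with a parallel (Kelvin-Voigt-like) combination of viscous and Coulomb friction damping, using the central difference scheme. All parameter values are invented and the excitation is a free-vibration release rather than one of the paper's 12 ground motions:

```python
# Central-difference integration of m*x'' + c*x' + f*sign(x') + k*x = 0:
# an SDOF system with parallel viscous + friction supplemental damping.
# Parameters are illustrative only.
import math

m, k = 1000.0, 4.0e4       # mass [kg], stiffness [N/m]
c = 500.0                  # viscous damping coefficient [N s/m]
f_fric = 200.0             # constant friction force [N]
dt, n_steps = 1.0e-3, 5000 # 5 s of response

x_prev, x = 0.05, 0.05     # released from 5 cm with zero velocity
peak = abs(x)
for _ in range(n_steps):
    v = (x - x_prev) / dt                       # backward-difference velocity
    friction = -math.copysign(f_fric, v) if v != 0 else 0.0
    a = (-c * v + friction - k * x) / m         # equation of motion
    x_next = 2 * x - x_prev + a * dt * dt       # central difference update
    x_prev, x = x, x_next
    peak = max(peak, abs(x))

print(f"peak |x| over 5 s: {peak*100:.2f} cm, final |x|: {abs(x)*100:.2f} cm")
```

Series (Maxwell-type) coupling would instead require an internal state for the damper deformation, which is the modelling distinction the paper investigates.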

Keywords: hysteretic damping, Kelvin model, Maxwell model, parallel coupling, series coupling, viscous damping

Procedia PDF Downloads 138
22312 Diagnostic Contribution of the MMSE-2:EV in the Detection and Monitoring of the Cognitive Impairment: Case Studies

Authors: Cornelia-Eugenia Munteanu

Abstract:

The goal of this paper is to present the diagnostic contribution that the screening instrument Mini-Mental State Examination-2: Expanded Version (MMSE-2:EV) makes in detecting cognitive impairment and in monitoring the progress of degenerative disorders. The diagnostic significance is underlined by the interpretation of MMSE-2:EV scores resulting from administration of the test to patients with mild and major neurocognitive disorders. The original MMSE is one of the most widely used screening tools for detecting cognitive impairment, both in clinical settings and in neurocognitive research, but practitioners and researchers are now turning their attention to the MMSE-2. To enhance its clinical utility, the new instrument was enriched and reorganized into three versions (MMSE-2:BV, MMSE-2:SV, and MMSE-2:EV), each with two forms: blue and red. The MMSE-2 has been adapted and used successfully in Romania since 2013. The cases were selected from current practice in order to cover a broad and significant range of neurocognitive pathology: mild cognitive impairment, Alzheimer’s disease, vascular dementia, mixed dementia, Parkinson’s disease, and conversion of mild cognitive impairment into Alzheimer’s disease. The MMSE-2:EV version was applied one month after the initial assessment, three months after the first reevaluation, and then every six months, alternating the blue and red forms. Correlated with age and educational level, the raw scores were converted into T scores, and then, using the mean and the standard deviation, the z scores were calculated. The differences in raw scores between the evaluations were analyzed for statistical significance in order to establish the progression of the disease over time. The results indicated that the psycho-diagnostic approach to evaluating cognitive impairment with the MMSE-2:EV is safe and that the application interval is optimal.
Alternating the forms prevents the learning phenomenon, and diagnostic accuracy and efficient therapeutic conduct derive from the use of national test norms. In clinical settings with a large flux of patients, the application of the MMSE-2:EV is a safe and fast psycho-diagnostic solution. Clinicians can make objective decisions, and for the patients the procedure does not take much time or energy, does not bother them, and does not force them to travel frequently.
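The score conversions mentioned above follow the usual psychometric conventions: a T score is a normed score with mean 50 and SD 10, and z = (T - 50)/10. A sketch of both steps, with invented norm values standing in for the national MMSE-2 norm tables:

```python
# Raw -> T -> z conversion sketch. The norm mean/SD below are invented;
# real values come from the age- and education-stratified norm tables.
def t_score(raw, norm_mean, norm_sd):
    """T score: mean 50, SD 10 relative to the normative group."""
    return 50 + 10 * (raw - norm_mean) / norm_sd

def z_from_t(t):
    return (t - 50) / 10

raw = 24                                  # hypothetical MMSE-2:EV raw score
t = t_score(raw, norm_mean=27.0, norm_sd=2.5)
z = z_from_t(t)
print(f"T = {t:.0f}, z = {z:.1f}")
```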

Keywords: MMSE-2, dementia, cognitive impairment, neuropsychology

Procedia PDF Downloads 493
22311 Study of Demographic, Hematological Profile and Risk Stratification in Chronic Myeloid Leukemia Patients

Authors: Rajandeep Kaur, Rajeev Gupta

Abstract:

Background: Chronic myeloid leukemia (CML) is the most common leukemia in India. The annual incidence of CML in India was originally reported to be 0.8 to 2.2 per 100,000 population. CML is a clonal disorder that is usually easily diagnosed because the leukemic cells of more than 95% of patients have a distinctive cytogenetic abnormality, the Philadelphia chromosome (Ph1). The approval of tyrosine kinase inhibitors (TKIs), which target BCR-ABL1 kinase activity, has significantly reduced the mortality rate associated with CML and revolutionized treatment. Material and Methods: 80 diagnosed cases of CML were enrolled and investigated. Bone marrow and molecular studies were performed, and with the EUTOS score, patients were stratified into low- and high-risk groups. All patients were treated with imatinib, and the molecular response was evaluated at 6-month and 12-month follow-up with a quantitative BCR-ABL RT-PCR assay. Results: Of the 80 patients in the study population, 40 were female and 40 were male (M:F = 1:1). Most patients (54) were in the 31-60 years age group. The most common presenting symptom was abdominal discomfort, followed by fever. Of the 80 patients, 25 (31.3%) had high EUTOS scores and 55 (68.8%) had low EUTOS scores. At 6-month follow-up, 36.3% of patients had a complete molecular response, 16.3% had a major molecular response, and 47.5% had no molecular response; at 12-month follow-up, 71.3% of patients had a complete molecular response, 16.25% had a major molecular response, and 12.5% had no molecular response. Conclusion: In this study, we found a significant correlation between the EUTOS score and the molecular response at 6-month and 12-month follow-up after imatinib therapy.

Keywords: chronic myeloid leukemia, European treatment and outcome study score, hematological response, molecular response, tyrosine kinase inhibitor

Procedia PDF Downloads 84
22310 Primary-Color Emitting Photon Energy Storage Nanophosphors for Developing High Contrast Latent Fingerprints

Authors: G. Swati, D. Haranath

Abstract:

Commercially available long-afterglow/persistent phosphors are proprietary materials, and hence the exact composition and phase responsible for their luminescent characteristics, such as initial intensity and afterglow luminescence time, are not known. Further, to generate various emission colors, commercially available persistent phosphors are physically blended with fluorescent organic dyes such as rhodamine, kiton, and methylene blue. Blending phosphors with organic dyes results in complete color coverage of the visible spectrum; however, with time, such phosphors undergo thermal and photo-bleaching, which results in the loss of their true emission color. Hence, the current work is dedicated to studies on inorganic-based, thermally and chemically stable, primary-color-emitting nanophosphors, namely SrAl2O4:Eu2+,Dy3+, (CaZn)TiO3:Pr3+, and Sr2MgSi2O7:Eu2+,Dy3+. The SrAl2O4:Eu2+,Dy3+ phosphor exhibits strong excitation in the UV and visible region (280-470 nm) with a broad emission peak centered at 514 nm, the characteristic emission of the parity-allowed 4f65d1→4f7 transitions of Eu2+ (8S7/2→2D5/2). The sunlight-excitable Sr2MgSi2O7:Eu2+,Dy3+ nanophosphor emits blue light (464 nm) with Commission Internationale de l’Eclairage (CIE) coordinates of (0.15, 0.13) and a color purity of 74%, with an afterglow time of > 5 hours for dark-adapted human eyes. The (CaZn)TiO3:Pr3+ phosphor system possesses high color purity (98%) and emits an intense, stable, and narrow red emission at 612 nm due to intra-4f transitions (1D2 → 3H4), with an afterglow time of 0.5 hour. The unusual persistent-luminescence property of these nanophosphors suppresses background effects without losing sensitive information, and the nanophosphors offer the advantages of visible-light excitation, negligible substrate interference, high-contrast bifurcation of the ridge pattern, and a non-toxic nature in revealing the ridge details of fingerprints.
Both level 1 and level 2 features of a fingerprint can be studied, which is useful for classification, indexing, comparison, and personal identification. A facile methodology to extract high-contrast fingerprints on non-porous and porous substrates, using a chemically inert, visible-light-excitable, nanosized phosphorescent label in the dark, has been presented. The chemistry of the non-covalent physisorption interaction between the long-afterglow phosphor powder and the sweat residue in fingerprints is discussed in detail. Real-time fingerprint development on porous and non-porous substrates has also been performed. To conclude, apart from conventional dark-vision applications, the as-prepared primary-color-emitting afterglow phosphors are potential candidates for developing high-contrast latent fingerprints.

Keywords: fingerprints, luminescence, persistent phosphors, rare earth

Procedia PDF Downloads 187
22309 Relevancy Measures of Errors in Displacements of Finite Element Analysis Results

Authors: A. B. Bolkhir, A. Elshafie, T. K. Yousif

Abstract:

This paper highlights methods of error estimation for finite element analysis (FEA) results. It indicates that the discretization error can be reduced by performing finite element analysis with successively finer meshes, or estimated by extrapolating response predictions from an orderly sequence of relatively low-degree-of-freedom analysis results. In addition, the paper reduces the round-off error by running the code at a higher precision. The paper applies these methods to finite element analysis results and draws conclusions based on the outcomes of their application.
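The extrapolation referred to here (and in the keywords) is Richardson extrapolation: combining solutions on two meshes of known refinement ratio and convergence order to estimate the exact value and the discretization error. A sketch with made-up displacement values converging at order p = 2:

```python
# Richardson extrapolation from two mesh solutions. The "tip displacement"
# values are invented; order p = 2 assumes a quadratically convergent
# quantity in the asymptotic range.
def richardson(u_coarse, u_fine, refinement_ratio=2.0, order=2):
    r_p = refinement_ratio ** order
    u_exact_est = (r_p * u_fine - u_coarse) / (r_p - 1)
    fine_error_est = u_exact_est - u_fine      # est. discretization error
    return u_exact_est, fine_error_est

u_h = 1.8800    # displacement with element size h   (hypothetical)
u_h2 = 1.9700   # displacement with element size h/2 (hypothetical)

u_star, err = richardson(u_h, u_h2)
print(f"extrapolated: {u_star:.4f}, est. error on fine mesh: {err:.4f}")
```

Monotonic convergence of the mesh sequence, also listed in the keywords, is the condition under which this estimate is trustworthy.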

Keywords: finite element analysis (FEA), discretization error, round-off error, mesh refinement, Richardson extrapolation, monotonic convergence

Procedia PDF Downloads 467
22308 Real-Time Hybrid Simulation for a Tuned Liquid Column Damper Implementation

Authors: Carlos Riascos, Peter Thomson

Abstract:

Real-time hybrid simulation (RTHS) is a modern cyber-physical technique for the experimental evaluation of complex systems that treats the system components with predictable behavior as a numerical substructure and the components that are difficult to model as an experimental substructure. It is therefore an attractive method for evaluating the response of civil structures under earthquake, wind, and anthropic loads. Another practical application of RTHS is the evaluation of control systems, as these devices are often nonlinear and their characterization is an important step in the design of controllers with the desired performance. In this paper, the response of a three-story shear frame controlled by a tuned liquid column damper (TLCD) and subject to base excitation is considered. Both passive and semi-active control strategies were implemented and compared. While the passive TLCD achieved a 50% reduction in the acceleration response of the main structure compared with the uncontrolled structure, the semi-active TLCD achieved a 70% reduction and was robust to variations in the dynamic properties of the main structure. In addition, an RTHS was implemented with the main structure modeled as a linear, time-invariant (LTI) system in a state-space representation, and the TLCD, under both control strategies, was evaluated on a shake table that reproduced the displacement of the virtual structure. Current assessment measures for RTHS were used to quantify the performance, with parameters such as generalized amplitude, the equivalent time delay between the target and measured displacement of the shake table, and the energy error computed from the measured force; these prove that the RTHS described in this paper is an accurate method for the experimental evaluation of structural control systems.

Keywords: structural control, hybrid simulation, tuned liquid column damper, semi-active control strategy

Procedia PDF Downloads 282
22307 An Approach to Low Velocity Impact Damage Modelling of Variable Stiffness Curved Composite Plates

Authors: Buddhi Arachchige, Hessam Ghasemnejad

Abstract:

In this study, the post-impact behavior of curved composite plates subjected to low-velocity impact was studied analytically and numerically. Approaches to damage modelling are proposed in which the stiffness of the damaged region is degraded by reducing the thickness of that region. Spring-mass models were used to model the impact response of the plate and impactor. The study involved designing two damage models, comparing and contrasting them to find the model that best fits the numerical results. The theoretical force-time responses were compared with the numerical results obtained through a detailed study carried out in LS-DYNA. The modified damage model established a good prediction of the analytical force-time response for different layups and geometries. This study provides a gateway to selecting the most effective layups for variable stiffness curved composite panels able to withstand higher impact damage.

Keywords: analytical modelling, composite damage, impact, variable stiffness

Procedia PDF Downloads 258
22306 Effects of Rations with High Amount of Crude Fiber on Rumen Fermentation in Suckler Cows

Authors: H. Scholz, P. Kuehne, G. Heckenberger

Abstract:

Problems during the calving period (December until May) often result from a high body condition score (BCS) at this time. At the end of the grazing period (frequently after early weaning), however, an increase in BCS can often be observed under German conditions. In the last eight weeks before calving, the body condition should be reduced, or at least not increased. Rations with a higher amount of crude fiber can be used (rations with straw or late-mowed grass silage). Fermentative digestion of fiber is slow and incomplete; that is why the fermentative process in the rumen can be reduced over a long feeding time. Viewed in this context, the feed intake of suckler cows (8 weeks before calving) on different rations and the fermentation in the rumen were checked by sampling rumen fluid. Eight suckler cows (Charolais) were fed a Total Mixed Ration (TMR) in the last eight weeks before calving and grass silage after calving. The amount of crude fiber in the TMR (grass silage, straw, mineral) before calving was varied by the addition of straw (30% [TMR1] vs. 60% [TMR2] of dry matter). After calving, grass silage [GS] was fed ad libitum, and the last measurement of rumen fluid took place on the pasture [PS]. Rumen fluid, plasma, body weight, and backfat thickness were collected. Rumen fluid pH was assessed using an electronic pH meter. Volatile fatty acids (VFA), sedimentation, methylene blue, and the number of infusoria were measured. From these 4 parameters, an 'index of rumen fermentation' [IRF] was formed. Fixed effects of treatment (TMR1, TMR2, GS, and PS) and number of lactations (3-7 lactations) were analyzed by ANOVA using SPSS Version 25.0 (significance at p ≤ 5%). Rumen fluid pH was significantly influenced by the variants (TMR1: 6.6; TMR2: 6.9; GS: 6.6; PS: 6.9) but was not affected by the other effects.
The IRF showed disturbed fermentation in the rumen when feeding TMR 1+2 with a high amount of crude fiber (score > 10.0 points) and a very good fermentation environment during grazing on the pasture (score: 6.9 points). Furthermore, significant differences were found for VFA, methylene blue, and the number of infusoria. The use of rations with a high amount of crude fiber from weaning to calving may cause deviations from undisturbed fermentation in the rumen and adversely affect the utilization of feed in the rumen.

Keywords: rumen fermentation, suckler cow, digestibility organic matter, crude fiber

Procedia PDF Downloads 124
22305 Resilience-Vulnerability Interaction in the Context of Disasters and Complexity: Study Case in the Coastal Plain of Gulf of Mexico

Authors: Cesar Vazquez-Gonzalez, Sophie Avila-Foucat, Leonardo Ortiz-Lozano, Patricia Moreno-Casasola, Alejandro Granados-Barba

Abstract:

In the last twenty years, academic and scientific literature has focused on understanding the processes and factors of coastal social-ecological systems' vulnerability and resilience. Some scholars argue that resilience and vulnerability are isolated concepts due to their epistemological origin, while others note the existence of a strong resilience-vulnerability relationship. Here we present an ordinal logistic regression model based on the analytical framework of the dynamic resilience-vulnerability interaction along the adaptive cycle of complex systems and the phases of the disaster process (during, recovery, and learning). In this way, we demonstrate that 1) during the disturbance, absorptive capacity (resilience as a core of attributes) and external response capacity explain the probability that household capitals suffer less damage, while exposure sets the thresholds on the amount of disturbance that households can absorb; 2) at recovery, absorptive capacity and external response capacity explain the probability that household capitals recover faster (resilience as an outcome) from damage; and 3) at learning, adaptive capacity (resilience as a core of attributes) explains the probability of household adaptation measures based on the enhancement of physical capital. As a result, during the disturbance phase, exposure has the greatest weight in the probability of capital damage, and households with absorptive and external response capacity elements absorbed the impact of floods in comparison with households without these elements. At the recovery phase, households with absorptive and external response capacity showed a faster recovery of their capital; however, the damage sets the thresholds of recovery time. More importantly, diversity in financial capital increases the probability of recovering other capitals, but it becomes a liability, so the probability of recovering the household finances over a longer time increases.
At the learning-reorganizing phase, adaptation (modifications to the house) increases the probability of less damage to physical capital; however, its effect is small. In conclusion, resilience is both an outcome and a core of attributes that interacts with vulnerability along the adaptive cycle and the phases of the disaster process. Absorptive capacity can diminish the damage caused by floods; however, when exposure overcomes the thresholds, both absorptive and external response capacity are insufficient. In the same way, absorptive and external response capacity diminish the recovery time of capital, but the damage sets the thresholds beyond which households are not capable of recovering their capital.
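The ordinal logistic (proportional-odds) model behind these probability statements can be sketched as follows. The cutpoints and the linear predictor values are hypothetical placeholders, not the study's fitted coefficients; the point is only the mechanics of turning a linear predictor (capitals, capacities, exposure) into probabilities over ordered damage categories:

```python
import math

def ordinal_probs(eta, cutpoints):
    """Proportional-odds model: P(Y <= j) = logistic(theta_j - eta).

    eta is the linear predictor (e.g., exposure minus capacities times their
    coefficients); cutpoints are the ordered thresholds theta_j.  Returns the
    probability of each ordered outcome category (here: damage levels).
    """
    logistic = lambda z: 1.0 / (1.0 + math.exp(-z))
    cum = [logistic(theta - eta) for theta in cutpoints] + [1.0]
    return [cum[0]] + [cum[j] - cum[j - 1] for j in range(1, len(cum))]

# Hypothetical thresholds for 4 ordered damage levels.
cutpoints = [-1.0, 0.5, 2.0]
# Higher eta = higher latent damage; absorptive capacity lowers eta.
low_capacity = ordinal_probs(1.5, cutpoints)
high_capacity = ordinal_probs(-0.5, cutpoints)
```

With these placeholder values, the household with more absorptive capacity (lower eta) concentrates more probability mass on the lowest damage category, mirroring the paper's finding.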

Keywords: absorptive capacity, adaptive capacity, capital, floods, recovery-learning, social-ecological systems

Procedia PDF Downloads 118
22304 A Pole Radius Varying Notch Filter with Transient Suppression for Electrocardiogram

Authors: Ramesh Rajagopalan, Adam Dahlstrom

Abstract:

Noise removal techniques play a vital role in the performance of electrocardiographic (ECG) signal processing systems. ECG signals can be corrupted by various kinds of noise, such as baseline wander, electromyographic interference, and power-line interference. One of the significant challenges in ECG signal processing is the degradation caused by additive 50 or 60 Hz power-line interference. This work investigates the removal of power-line interference and the suppression of the transient response when filtering noise-corrupted ECG signals. We demonstrate the effectiveness of an Infinite Impulse Response (IIR) notch filter with a time-varying pole radius for improving the transient behavior: a temporary change in the pole radius of the filter diminishes the transient. Simulation results show that the proposed IIR filter with a time-varying pole radius outperforms traditional IIR notch filters in terms of mean square error and transient suppression.
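A second-order notch with a per-sample pole radius can be sketched directly from its difference equation. This is a generic notch structure, not the paper's specific filter; the radius schedule below (starting small and settling exponentially toward its final value) is one plausible choice for shortening the start-up transient:

```python
import math

def notch(x, f0, fs, radii):
    """IIR notch at f0 with a per-sample pole radius radii[n].

    y[n] = x[n] - 2cos(w0) x[n-1] + x[n-2] + 2 r cos(w0) y[n-1] - r^2 y[n-2]
    places zeros on the unit circle at +/-w0 and poles at radius r behind them.
    """
    w0 = 2.0 * math.pi * f0 / fs
    c = math.cos(w0)
    y, x1, x2, y1, y2 = [], 0.0, 0.0, 0.0, 0.0
    for n, xn in enumerate(x):
        r = radii[n]
        yn = xn - 2.0 * c * x1 + x2 + 2.0 * r * c * y1 - r * r * y2
        x2, x1 = x1, xn
        y2, y1 = y1, yn
        y.append(yn)
    return y

fs, n = 500.0, 4000
mains = [math.sin(2 * math.pi * 50.0 * i / fs) for i in range(n)]  # 50 Hz interference
fixed = notch(mains, 50.0, fs, [0.98] * n)
# Time-varying pole radius: begin with a small radius (short transient) and
# let it settle toward its steady-state value.
ramped = [0.99 - (0.99 - 0.80) * math.exp(-i / 50.0) for i in range(n)]
varying = notch(mains, 50.0, fs, ramped)
```

In both cases the 50 Hz component is driven essentially to zero once the transient has decayed; the time-varying radius is what the paper tunes to shorten that transient.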

Keywords: notch filter, ECG, transient, pole radius

Procedia PDF Downloads 360
22303 A Novel Approach towards Test Case Prioritization Technique

Authors: Kamna Solanki, Yudhvir Singh, Sandeep Dalal

Abstract:

Software testing is a time- and cost-intensive process. A scrutiny of the code and rigorous testing are required to identify and rectify putative bugs. The process of bug identification and its consequent correction is continuous in nature, and often some of the bugs are removed after the software has been launched in the market. This process of validating the altered software during the maintenance phase is termed regression testing. Regression testing ubiquitously faces resource constraints; therefore, the selection of an appropriate set of test cases from the entire gamut of test cases is a critical issue for regression test planning. This paper presents a novel method for designing a suitable prioritization process to optimize the fault detection rate and performance of regression tests under predefined constraints. The proposed test case prioritization method, m-ACO, alters the food source selection criteria of natural ants and is basically a modified version of Ant Colony Optimization (ACO). The proposed m-ACO approach has been coded in Perl, and the results are validated on three examples by computing the Average Percentage of Faults Detected (APFD) metric.
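The APFD metric used for validation has a standard closed form: for n test cases and m faults, APFD = 1 - (TF1 + ... + TFm)/(n*m) + 1/(2n), where TFi is the position of the first test that reveals fault i. A small sketch (with a made-up fault matrix, not the paper's examples):

```python
def apfd(order, faults_by_test, n_faults):
    """Average Percentage of Faults Detected for a test-case ordering.

    APFD = 1 - (TF_1 + ... + TF_m) / (n * m) + 1 / (2n), where TF_i is the
    1-indexed position of the first test that reveals fault i.  A fault
    never revealed is penalised with position n + 1 (one common convention).
    """
    n = len(order)
    first_pos = {}
    for pos, test in enumerate(order, start=1):
        for fault in faults_by_test.get(test, ()):
            first_pos.setdefault(fault, pos)
    total = sum(first_pos.get(f, n + 1) for f in range(n_faults))
    return 1.0 - total / (n * n_faults) + 1.0 / (2 * n)

# Hypothetical fault matrix: which faults each test case reveals.
faults = {"t1": {0}, "t2": {1}, "t3": {0, 1, 2, 3}, "t4": {3}, "t5": set()}
baseline = apfd(["t1", "t2", "t3", "t4", "t5"], faults, 4)
prioritized = apfd(["t3", "t1", "t2", "t4", "t5"], faults, 4)
```

Running the fault-revealing test first raises APFD (here from 0.65 to 0.9), which is exactly what a prioritization technique such as m-ACO tries to maximize.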

Keywords: regression testing, software testing, test case prioritization, test suite optimization

Procedia PDF Downloads 313
22302 Integrating the Athena Vortex Lattice Code into a Multivariate Design Synthesis Optimisation Platform in JAVA

Authors: Paul Okonkwo, Howard Smith

Abstract:

This paper describes a methodology to integrate the Athena Vortex Lattice aerodynamic software for automated operation in a multivariate optimisation of the Blended Wing Body (BWB) aircraft. The Athena Vortex Lattice (AVL) code, developed at the Massachusetts Institute of Technology by Mark Drela, allows for the aerodynamic analysis of aircraft using the vortex lattice method. Ordinarily, AVL operation requires a text file containing the aircraft geometry to be loaded into the AVL solver in order to determine the aerodynamic forces and moments. However, automated operation is required to enable integration into a multidisciplinary optimisation framework. Automated AVL operation within the JAVA design environment nonetheless requires a modification and recompilation of the AVL source code into an executable file capable of running on Windows and other platforms without the X11 libraries. This paper describes the procedure for integrating the FORTRAN-written AVL software for automated operation within the multivariate design synthesis optimisation framework for the conceptual design of the BWB aircraft.
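The automation pattern, driving AVL's interactive menus by piping a keystroke script to its standard input and capturing the text output, can be sketched as below. The command sequence is an assumption based on AVL's interactive interface (OPER menu, alpha setting, execute), and the executable and geometry paths are placeholders; the paper itself does this on the JAVA side via JNI, where a ProcessBuilder wrapper plays the role of subprocess here:

```python
import subprocess

def avl_command_script(alpha_deg):
    """Keystroke script piped to AVL's standard input (assumed menu commands:
    enter OPER, set the angle of attack, execute, back out, quit)."""
    return "OPER\na a {0}\nx\n\nquit\n".format(alpha_deg)

def run_avl(avl_exe, geometry_file, alpha_deg):
    """Launch AVL on a geometry file and capture its text output.

    avl_exe and geometry_file are placeholders for the recompiled, X11-free
    executable and the aircraft geometry text file described above.
    """
    result = subprocess.run(
        [avl_exe, geometry_file],
        input=avl_command_script(alpha_deg),
        capture_output=True, text=True, timeout=60,
    )
    return result.stdout

script = avl_command_script(2.0)
```

The optimiser would then parse the force and moment coefficients out of the captured text at each design iteration.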

Keywords: aerodynamics, automation, optimisation, AVL, JNI

Procedia PDF Downloads 565
22301 Relationship between Response of the Resistive Sensors on the Chosen Volatile Organic Compounds (VOCs) and Their Concentration

Authors: Marek Gancarz, Agnieszka Nawrocka, Robert Rusinek, Marcin Tadla

Abstract:

Volatile organic compounds (VOCs) are fungal metabolites in gaseous form produced during improper storage of agricultural commodities (e.g., grain, food). Spoilt commodities produce a wide range of VOCs, including alcohols, esters, aldehydes, ketones, alkanes, alkenes, furans, phenols, etc. The characteristic VOCs and odours can be determined by using an electronic nose (e-Nose), which contains a matrix of different kinds of sensors, e.g., resistive sensors. The aim of the present studies was to determine the relationship between the response of the resistive sensors to the chosen volatiles and their concentration. Volatiles characteristic of cereals were chosen according to the literature: ethanol, 3-methyl-1-butanol, and hexanal. Analysis of the sensor signals shows that the signal shape differs between substances. Moreover, each VOC signal gives information about the maximum of the normalized sensor response (R/Rmax), an impregnation time (tIM), and a cleaning time at half maximum of R/Rmax (tCL). These three parameters can be regarded as a 'VOC fingerprint'. Seven resistive sensors (TGS2600-B00, TGS2602-B00, TGS2610-C00, TGS2611-C00, TGS2611-E00, TGS2612-D00, TGS2620-C00) produced by Figaro USA Inc. and one (AS-MLV-P2) produced by AMS AG, Austria, were used. Two of the seven sensors (TGS2611-E00, TGS2612-D00) did not react to the chosen VOCs. The most responsive sensor was AS-MLV-P2. The research was supported by the National Centre for Research and Development (NCBR), Grant No. PBS2/A8/22/2013.
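Extracting the three fingerprint parameters from a recorded sensor cycle is straightforward. The definitions below are assumptions (tIM taken as time from the start of the record to the response maximum, tCL as time after the maximum for the normalized response to fall to 0.5), and the triangular signal is synthetic, not measured data:

```python
def voc_fingerprint(t, response):
    """Extract the three 'VOC fingerprint' parameters from one sensor cycle.

    Assumed definitions: tIM is the time from the start of the record to the
    response maximum; tCL is the time after the maximum for the normalized
    response to fall back to half maximum (0.5).
    """
    r_max = max(response)
    norm = [v / r_max for v in response]     # normalized response R/Rmax
    i_max = norm.index(max(norm))
    t_im = t[i_max] - t[0]
    t_cl = None                              # stays None if no half-max crossing
    for i in range(i_max, len(norm)):
        if norm[i] <= 0.5:
            t_cl = t[i] - t[i_max]
            break
    return {"r_norm_max": max(norm), "t_im": t_im, "t_cl": t_cl}

# Synthetic triangular response: 10 s rise, 20 s decay (hypothetical units).
times = list(range(31))
signal = [ti / 10.0 if ti <= 10 else (30 - ti) / 20.0 for ti in times]
fp = voc_fingerprint(times, signal)
```

Applying this to cycles recorded at several concentrations gives the response-concentration relationship the study is after.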

Keywords: agricultural commodities, organic compounds, resistive sensors, volatile

Procedia PDF Downloads 349
22300 Evaluating the Effectiveness of Electronic Response Systems in Technology-Oriented Classes

Authors: Ahmad Salman

Abstract:

Electronic Response Systems such as Kahoot, Poll Everywhere, and Google Classroom are gaining popularity for surveying audiences in events, meetings, and classrooms. The reason is mainly the ease of use and convenience these tools bring, since they provide mobile applications with a simple user interface. In this paper, we present a case study on the effectiveness of using Electronic Response Systems on student participation and learning experience in the classroom. We use a polling application for class exercises in two different technology-oriented classes. We evaluate the effectiveness of the polling applications through statistical analysis of the students' performance in these two classes and compare it to the performance of students who took the same classes without using the polling application for class participation. Our results show an average increase of 11% in the performance of the students who used the Electronic Response System compared to those who did not.

Keywords: interactive learning, classroom technology, electronic response systems, polling applications, learning evaluation

Procedia PDF Downloads 110
22299 Microstructure Evolution and Pre-transformation Microstructure Reconstruction in Ti-6Al-4V Alloy

Authors: Shreyash Hadke, Manendra Singh Parihar, Rajesh Khatirkar

Abstract:

In the present investigation, the variation of the microstructure with the heat treatment conditions, i.e., temperature and time, was observed. Ti-6Al-4V alloy was subjected to solution annealing treatments in the β (1066 °C) and α+β phase fields (930 °C and 850 °C), followed by quenching, air cooling, and furnace cooling to room temperature, respectively. The effect of solution annealing and cooling on the microstructure was studied using optical microscopy (OM), scanning electron microscopy (SEM), electron backscattered diffraction (EBSD), and x-ray diffraction (XRD). The chemical composition of the β phase for the different conditions was determined with the help of an energy dispersive spectrometer (EDS) attached to the SEM. Furnace cooling resulted in the development of a coarser (α+β) structure, while air cooling resulted in a much finer structure with Widmanstätten morphology of α at the grain boundaries. Quenching from the solution annealing temperature formed α' martensite, its proportion being dependent on the temperature in the β phase field. It is well known that the β to α transformation follows the Burgers orientation relationship (OR). In order to reconstruct the microstructure of the parent β phase, a MATLAB code was written using the neighbor-to-neighbor method, the triplet method, and Tari's method. The code was tested on the annealed samples (1066 °C solution annealing temperature followed by furnace cooling to room temperature). The parent phase data thus generated were then plotted using the TSL-OIM software. The reconstruction results of the above methods were compared and analyzed. Tari's approach (a clustering approach) gave better results than the neighbor-to-neighbor and triplet methods, but the time taken by the triplet method was the least of the three.

Keywords: Ti-6Al-4V alloy, microstructure, electron backscattered diffraction, parent phase reconstruction

Procedia PDF Downloads 430
22298 Active Power Filters and their Smart Grid Integration - Applications for Smart Cities

Authors: Pedro Esteban

Abstract:

Most installations nowadays are exposed to many power quality problems, and they also face numerous challenges in complying with grid code and energy efficiency requirements. The reason is that they are not designed to support the nonlinear, unbalanced, and variable loads and generators that make up a large percentage of modern electric power systems. These problems and challenges become especially critical when designing green buildings and smart cities. They are caused by equipment typically found in these installations, such as variable speed drives (VSD), transformers, lighting, battery chargers, double-conversion UPS (uninterruptible power supply) systems, highly dynamic loads, single-phase loads, fossil fuel generators, and renewable generation sources, to name a few. Moreover, events like capacitor switching (from existing capacitor banks or passive harmonic filters), auto-reclose operations of transmission and distribution lines, or the starting of large motors also contribute to these problems and challenges. Active power filters (APF) are one of the fastest-growing power electronics technologies for solving power quality problems and meeting grid code and energy efficiency requirements across a wide range of segments and applications. They are a high-performance, flexible, compact, modular, and cost-effective type of power electronics solution that provides an instantaneous and effective response in low or high voltage electric power systems. They enable longer equipment lifetime, higher process reliability, improved power system capacity and stability, and reduced energy losses, complying with the most demanding power quality and energy efficiency standards and grid codes. Several types of active power filters exist nowadays, including active harmonic filters (AHF), static var generators (SVG), active load balancers (ALB), hybrid var compensators (HVC), and low harmonic drives (LHD).
All these devices can be used in Smart City applications, bringing several technical and economic benefits.
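The core idea of an active harmonic filter, measure the distorted load current, isolate its fundamental, and inject the remaining harmonic content in antiphase so the grid supplies only the fundamental, can be sketched numerically. The waveform below is a hypothetical nonlinear load, and the single-bin DFT stands in for the real-time fundamental extraction a real AHF performs:

```python
import math

def fundamental(samples):
    """Single-bin DFT at the fundamental: returns the 50/60 Hz component of a
    periodic current sampled over exactly one period."""
    n = len(samples)
    a = 2.0 / n * sum(s * math.cos(2 * math.pi * i / n) for i, s in enumerate(samples))
    b = 2.0 / n * sum(s * math.sin(2 * math.pi * i / n) for i, s in enumerate(samples))
    return [a * math.cos(2 * math.pi * i / n) + b * math.sin(2 * math.pi * i / n)
            for i in range(n)]

n = 256
theta = [2 * math.pi * i / n for i in range(n)]
# Hypothetical nonlinear load: fundamental plus 3rd and 5th harmonics.
load = [math.sin(t) + 0.3 * math.sin(3 * t) + 0.1 * math.sin(5 * t) for t in theta]
fund = fundamental(load)
# The AHF injects the harmonic content in antiphase...
injected = [l - f for l, f in zip(load, fund)]
# ...so the source only has to supply the clean fundamental.
source = [l - i for l, i in zip(load, injected)]
```

The same decomposition generalizes to the reactive and unbalance components that SVGs and ALBs compensate.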

Keywords: power quality improvement, energy efficiency, grid code compliance, green buildings, smart cities

Procedia PDF Downloads 100
22297 Cross Project Software Fault Prediction at Design Phase

Authors: Pradeep Singh, Shrish Verma

Abstract:

Software fault prediction models are created using the source code, processed metrics from the same or a previous version of the code, and related fault data. Some companies do not store and keep track of all the artifacts required for software fault prediction. To construct a fault prediction model for such a company, training data from other projects can be one potential solution. The earlier a fault is predicted, the less it costs to correct. The training data consist of metrics data and related fault data at the function/module level. This paper investigates fault prediction at an early stage using cross-project data, focusing on design metrics. In this study, an empirical analysis is carried out to validate design metrics for cross-project fault prediction. The machine learning technique used for evaluation is Naïve Bayes. The design-phase metrics of other projects can be used as an initial guideline for projects where no previous fault data is available. We analyze seven data sets from the NASA Metrics Data Program, which offer design as well as code metrics. Overall, the results of cross-project prediction are comparable to within-company learning.
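The cross-project setup, fit a Naïve Bayes classifier on the design metrics of one project, then predict fault-proneness on another, can be sketched in a few lines. The metric values below are invented toy data (e.g., coupling and complexity per module), not the NASA MDP data sets:

```python
import math

class GaussianNB:
    """Minimal Gaussian Naive Bayes: fit on the design metrics of one
    project, predict fault-proneness of modules in another."""

    def fit(self, X, y):
        self.classes = sorted(set(y))
        self.stats = {}
        for c in self.classes:
            rows = [x for x, label in zip(X, y) if label == c]
            means = [sum(col) / len(rows) for col in zip(*rows)]
            variances = [max(1e-9, sum((v - m) ** 2 for v in col) / len(rows))
                         for col, m in zip(zip(*rows), means)]
            self.stats[c] = (len(rows) / len(X), means, variances)
        return self

    def predict(self, X):
        def log_likelihood(x, c):
            prior, means, variances = self.stats[c]
            ll = math.log(prior)
            for v, m, var in zip(x, means, variances):
                ll += -0.5 * math.log(2 * math.pi * var) - (v - m) ** 2 / (2 * var)
            return ll
        return [max(self.classes, key=lambda c: log_likelihood(x, c)) for x in X]

# Project A (training): toy design metrics per module, 1 = faulty module.
X_a = [[2, 3], [1, 4], [2, 2], [8, 9], [9, 8], [7, 10]]
y_a = [0, 0, 0, 1, 1, 1]
# Project B (no fault history of its own): predict from A's model.
X_b = [[1, 3], [8, 9]]
pred = GaussianNB().fit(X_a, y_a).predict(X_b)
```

In practice, the design metrics of the two projects would first be normalized so that the cross-project feature distributions are comparable.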

Keywords: software metrics, fault prediction, cross project, within project

Procedia PDF Downloads 322
22296 Allergenic Potential of Airborne Algae Isolated from Malaysia

Authors: Chu Wan-Loy, Kok Yih-Yih, Choong Siew-Ling

Abstract:

The human health risks due to poor air quality caused by a wide array of microorganisms have attracted much interest. Airborne algae have been reported as early as the 19th century, and they can be found in the air of tropical and warm atmospheres. Airborne algae normally originate from water surfaces, soil, trees, buildings, and rock surfaces. It is estimated that at least 2880 algal cells are inhaled per day by a human. However, there are relatively few published data on airborne algae and their related adverse health effects, except sporadic reports of algae-associated clinical allergenicity. A collection of airborne algae cultures was established following a recent survey on the occurrence of airborne algae in indoor and outdoor environments in Kuala Lumpur. The aim of this study was to investigate the allergenic potential of the isolated airborne green and blue-green algae, namely Scenedesmus sp., Cylindrospermum sp., and Hapalosiphon sp. Suspensions of freeze-dried airborne algae were administered to a BALB/c mouse model through the intra-nasal route to determine their allergenic potential. Results showed that Scenedesmus sp. (1 mg/mL) increased the systemic IgE levels in mice by 3-8 fold compared to pre-treatment. On the other hand, Cylindrospermum sp. and Hapalosiphon sp. at a similar concentration caused IgE to increase by 2-4 fold. The potential of airborne algae to cause IgE-mediated type 1 hypersensitivity was elucidated using other immunological markers, namely the cytokines interleukin (IL)-4, 5, and 6 and interferon-γ (IFN-γ). When we compared the amount of interleukins in mouse serum between day 0 and day 53 (day of sacrifice), Hapalosiphon sp. (1 mg/mL) increased the expression of IL-4 and IL-6 by 8 fold, while Cylindrospermum sp. (1 mg/mL) increased the expression of IL-4 and IFN-γ by 8 and 2 fold, respectively. In conclusion, repeated exposure to the three selected airborne algae may stimulate the immune response and generate IgE in a mouse model.

Keywords: airborne algae, respiratory, allergenic, immune response, Malaysia

Procedia PDF Downloads 220
22295 Reduction of Multiple User Interference for Optical CDMA Systems Using Successive Interference Cancellation Scheme

Authors: Tawfig Eltaif, Hesham A. Bakarman, N. Alsowaidi, M. R. Mokhtar, Malek Harbawi

Abstract:

A primary problem in optical code-division multiple access (OCDMA) systems is the multiple user interference (MUI) noise resulting from the overlap among users. In this article, we aim to mitigate this problem by studying an interference cancellation scheme called successive interference cancellation (SIC). The scheme is tested on two different detection schemes, spectral amplitude coding (SAC) and direct detection (DS), using partial modified prime (PMP) codes as the signature codes. It was found that the SIC scheme based on both the SAC and DS methods has the potential to suppress the intensity noise, that is to say, it can mitigate MUI noise. Furthermore, the SIC/DS scheme showed much lower bit error rate (BER) performance than the SIC/SAC scheme for different magnitudes of effective power. Hence, many more users can be supported by the SIC/DS receiver system.
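The SIC loop itself, detect the strongest user, regenerate its contribution from the decided bit, subtract it, and repeat for the next user, can be sketched as below. For simplicity this uses bipolar codes in a synchronous baseband CDMA model (the paper's PMP codes are unipolar optical codes); only the cancellation mechanism is illustrated, with made-up amplitudes and bits:

```python
def sic_detect(received, codes, amplitudes):
    """Successive interference cancellation for synchronous CDMA (sketch).

    Users are detected strongest first; each decided bit is regenerated with
    its spreading code and amplitude, then subtracted before the next stage.
    """
    r = list(received)
    bits = [0] * len(codes)
    order = sorted(range(len(codes)), key=lambda u: -amplitudes[u])
    for u in order:
        corr = sum(ri * ci for ri, ci in zip(r, codes[u]))   # matched filter
        bits[u] = 1 if corr >= 0 else -1
        # Regenerate this user's signal and cancel it from the residual.
        r = [ri - amplitudes[u] * bits[u] * ci for ri, ci in zip(r, codes[u])]
    return bits

# Three users with cyclic shifts of a length-7 m-sequence (cross-correlation -1).
base = [1, 1, 1, -1, 1, -1, -1]
codes = [base, base[-1:] + base[:-1], base[-2:] + base[:-2]]
amplitudes = [1.0, 0.8, 0.6]
sent = [1, -1, 1]
received = [sum(a * b * c[i] for a, b, c in zip(amplitudes, sent, codes))
            for i in range(7)]
recovered = sic_detect(received, codes, amplitudes)
```

Because each cancellation stage removes the dominant interferer, the weakest user is detected against a nearly interference-free residual, which is the effect the SIC/DS receiver exploits.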

Keywords: optical code-division multiple access (OCDMA), successive interference cancellation (SIC), multiple user interference (MUI), spectral amplitude coding (SAC), partial modified prime code (PMP)

Procedia PDF Downloads 504