Search results for: code mixing
1893 Theoretical Analysis and Numerical Evaluation of the Flow inside the Supersonic Nozzle for Chemical Lasers
Authors: Mohammedi Ferhate, Hakim Chadli, Laggoun Chaouki
Abstract:
The main objectives of work in this area are, first, obtaining the high laser energies in short time durations needed for feasibility studies of laser-induced exothermic chemical reactions and, second, investigating the physical principles that can be used to make laser sources capable of delivering high average powers. We note that, in order to reach both objectives, one has to convert electrical or chemical energy into laser energy using dense gaseous media. We present results from the early development of an F-atom source appropriate for HF and DF chemical laser research. We then explain the main difficulties encountered in working with dense gases for that purpose and describe the evaluation of a downstream-mixing scheme for the level transitions (001) → (100) and (001) → (020) in a gas dynamic laser. The physical phenomena that control the operation of presently existing laser devices are now sufficiently well understood that new generations of lasers can be expected to be designed in the future. The proposed model of excitation and relaxation levels was finally validated by a numerical code, built from MATLAB toolboxes, over the different nozzle parameters. Keywords: hydrogen, combustion, chemical laser, halogen atom
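The quasi-one-dimensional nozzle evaluation referred to above is governed by the isentropic area–Mach relation; a minimal sketch of that relation follows, written in Python rather than the authors' MATLAB toolboxes, with an assumed ratio of specific heats and an illustrative Mach range.

```python
import math

def area_ratio(mach, gamma=1.4):
    """Isentropic area ratio A/A* for quasi-1D nozzle flow at a given Mach number."""
    term = (2.0 / (gamma + 1.0)) * (1.0 + 0.5 * (gamma - 1.0) * mach ** 2)
    return (1.0 / mach) * term ** ((gamma + 1.0) / (2.0 * (gamma - 1.0)))

# Tabulate the supersonic branch, e.g. for sizing the divergent section.
for m in [1.0, 1.5, 2.0, 3.0, 4.0]:
    print(f"M = {m:4.1f}  A/A* = {area_ratio(m):7.3f}")
```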
Procedia PDF Downloads 84
1892 ROOP: Translating Sequential Code Fragments to Distributed Code Fragments Using Deep Reinforcement Learning
Authors: Arun Sanjel, Greg Speegle
Abstract:
Every second, massive amounts of data are generated, and Data Intensive Scalable Computing (DISC) frameworks have evolved into effective tools for analyzing them. Since the underlying architecture of these distributed computing platforms is often new to users, building a DISC application can be time-consuming and prone to errors. The automated conversion of a sequential program to a DISC program would consequently improve productivity significantly. However, synthesizing a user’s intended program from an input specification is complex, with several important applications, such as distributed program synthesis and code refactoring. Existing works such as Tyro and Casper rely entirely on deductive synthesis techniques or similar program synthesis approaches. Our approach is to develop a data-driven synthesis technique that identifies sequential components and translates them to equivalent distributed operations. We emphasize using reinforcement learning and unit testing as feedback mechanisms to achieve our objectives. Keywords: program synthesis, distributed computing, reinforcement learning, unit testing, DISC
Procedia PDF Downloads 103
1891 Control and Automation of Fluid at Micro/Nano Scale for Bio-Analysis Applications
Authors: Reza Hadjiaghaie Vafaie, Sevda Givtaj
Abstract:
Automation and control of biological samples and solutions at the microscale is a major advantage for biochemistry analysis and biological diagnostics. Despite the known potential of miniaturization in biochemistry and biomedical applications, comparatively little is known about fluid automation and control at the microscale. Here, we study the electric field effect inside a fluidic channel, and proper electrode structures with different patterns are proposed to form forward, reversal, and rotational flows inside the channel. The simulation results confirmed that the AC electro-thermal flow is efficient for the control and automation of highly conductive solutions. In this research, the fluid pumping and mixing effects were numerically studied by solving the physics-coupled electric, temperature, hydrodynamic, and concentration fields inside a microchannel. From an experimental point of view, the electrode structures are deposited on a silicon substrate and bonded to a PDMS microchannel to form a microfluidic chip. The motions of fluorescent particles in pumping and mixing modes were captured by using a CCD camera. By measuring the frequency response of the fluid and exciting the electrodes with the proper voltage, the fluid motions (including pumping and mixing effects) are observed inside the channel through the CCD camera. Based on the results, there is good agreement between the experimental and simulation studies. Keywords: microfluidic, nano/micro actuator, AC electrothermal, Reynolds number, micropump, micromixer, microfabrication, mass transfer, biomedical applications
Procedia PDF Downloads 79
1890 The Synthesis and Analysis of Two Long Lasting Phosphorescent Compounds: SrAl2O4: Eu2+, Dy3+
Authors: Ghayah Alsaleem
Abstract:
This research project focused on specific compounds, while a literature review was completed on the broader subject of long-lasting phosphorescence. For the review and subsequent laboratory work, long-lasting phosphorescence compounds were defined as materials that have an afterglow decay time greater than a few minutes. The decay time is defined as the time between the end of excitation and the moment the light intensity drops below 0.32 mcd/m². This definition is widely used in industry and in most research studies. The experimental work focused on the known long-lasting phosphorescence compound strontium aluminate (SrAl2O4: Eu2+, Dy3+). At first, preparation was similar to literature methods. Temperature, dopant levels and mixing methods were then varied in order to expose their effects on long-lasting phosphorescence. The effect of temperature was investigated for SrAl2O4: Eu2+, Dy3+, and resulted in the finding that 1350°C was the only temperature to which the compound could be heated in the differential scanning calorimeter (DSC) in order to achieve any phosphorescence; however, no temperatures above 1350°C were investigated. The variation of mixing method and co-dopant level in the strontium aluminate compounds resulted in the finding that the dry mixing method using a Turbula mixer produced the longest afterglow. It was also found that an increase of europium inclusion, from 1 mol% to 2 mol% in these compounds, increased the brightness of the phosphorescence. As this increased batch was mixed using sonication, the phosphorescent time was actually reduced, giving green long-lasting phosphorescence for up to 20 minutes following 30 minutes of excitation, and 50 minutes when the europium content was doubled and mixed using sonication. Keywords: long lasting, phosphorescence, excitation, europium
Procedia PDF Downloads 181
1889 Rest API Based System-level Test Automation for Mobile Applications
Authors: Jisoo Song
Abstract:
Today’s mobile applications communicate with servers more and more in order to access external services or information. Also, server-side code changes are more frequent than client-side code changes in a mobile application, and these frequent changes lead to an increase in testing cost. To reduce costs, UI-based test automation can be one of the solutions; it is a common automation technique in system-level testing. However, it can be unsuitable for mobile applications. When you automate tests based on UI elements for mobile applications, there are some limitations, such as the overhead of script maintenance or the difficulty of finding invisible defects that UI elements cannot represent. To overcome these limitations, we present a new automation technique based on Rest API. You can automate system-level tests through test scripts that you write. These scripts call a series of Rest APIs in a user’s action sequence. This technique does not require testers to know the internal implementation details, only the inputs and expected outputs of the Rest API. You can easily modify test cases by modifying Rest API input values and also find problems that might not be evident at the UI level by validating output values. For example, when an application receives price information from a payment server and the user cannot see it at the UI level, Rest API based scripts can check whether the price information is correct or not. More than 10 mobile applications at our company are tested automatically based on Rest API scripts whenever application source code, mostly server source code, is built. We find defects right away by setting a script as a build job in the CI server; the build job starts when the application code build is completed. This presentation will also include field cases from our company. Keywords: case studies at SK Planet, introduction of rest API based test automation, limitations of UI based test automation
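A minimal sketch of what such a Rest API based system-level test script might look like follows; the base URL, endpoints, payloads, and assertions are hypothetical placeholders standing in for a user's action sequence, not the scripts used at SK Planet.

```python
import requests

BASE_URL = "https://api.example.com"   # hypothetical server under test

def test_order_price_is_consistent():
    """System-level check that the price returned by the payment server is
    attached correctly to the created order, without touching the UI."""
    session = requests.Session()

    # Step 1: the user's action sequence expressed as Rest API calls
    login = session.post(f"{BASE_URL}/login",
                         json={"user": "tester", "password": "secret"})
    assert login.status_code == 200

    order = session.post(f"{BASE_URL}/orders",
                         json={"item_id": 42, "quantity": 1})
    assert order.status_code == 201
    order_id = order.json()["order_id"]

    # Step 2: validate output values that are invisible at the UI level
    price = session.get(f"{BASE_URL}/orders/{order_id}/price").json()
    assert price["amount"] > 0
    assert price["currency"] == "KRW"

if __name__ == "__main__":
    test_order_price_is_consistent()
    print("Rest API based system-level test passed")
```

Such a script can be registered as a build job in the CI server so that it runs whenever the server code is rebuilt.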
Procedia PDF Downloads 446
1888 Neutral Heavy Scalar Searches via Standard Model Gauge Boson Decays at the Large Hadron Electron Collider with Multivariate Techniques
Authors: Luigi Delle Rose, Oliver Fischer, Ahmed Hammad
Abstract:
In this article, we study the prospects of the proposed Large Hadron electron Collider (LHeC) in the search for heavy neutral scalar particles. We consider a minimal model with one additional complex scalar singlet that interacts with the Standard Model (SM) via mixing with the Higgs doublet, giving rise to an SM-like Higgs boson and a heavy scalar particle. Both scalar particles are produced via vector boson fusion and can be tested via their decays into pairs of SM particles, analogously to the SM Higgs boson. Using multivariate techniques, we show that the LHeC is sensitive to heavy scalars with masses between 200 and 800 GeV down to scalar mixing of order 0.01. Keywords: beyond the standard model, large hadron electron collider, multivariate analysis, scalar singlet
Procedia PDF Downloads 136
1887 Pallet Tracking and Cost Optimization of the Flow of Goods in Logistics Operations by Serial Shipping Container Code
Authors: Dominika Crnjac Milic, Martina Martinovic, Vladimir Simovic
Abstract:
The case study method in this paper shows the implementation of Information Technology (IT) and the Serial Shipping Container Code (SSCC) in a Croatian company that deals with logistics operations and provides logistics services in the cold chain segment. This company is aware of the sensitivity of the goods entrusted to it by the users of the service, as well as of the importance of speed and accuracy in providing logistics services. To that end, it has implemented and used the latest IT to ensure the highest standard of high-quality logistics services to its customers. Looking for efficiency and optimization of supply chain management, while maintaining a high level of quality of the products that are sold, today's users of outsourced logistics services are open to the implementation of new IT products that ultimately deliver savings. By analysing the positive results and the difficulties that arise when using this technology, we aim to provide an insight into the potential of this approach for the logistics service provider. Keywords: logistics operations, serial shipping container code, information technology, cost optimization
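As background on the identifier itself: an SSCC is an 18-digit GS1 code whose final digit is a mod-10 check digit. The sketch below illustrates that calculation as a generic example (the digits shown are made up); it is not the company's implementation.

```python
def sscc_check_digit(data17: str) -> int:
    """GS1 mod-10 check digit for the first 17 digits of an SSCC.
    Weights alternate 3,1,3,1,... starting from the rightmost data digit."""
    if len(data17) != 17 or not data17.isdigit():
        raise ValueError("expected 17 digits")
    total = sum(int(d) * (3 if i % 2 == 0 else 1)
                for i, d in enumerate(reversed(data17)))
    return (10 - total % 10) % 10

def is_valid_sscc(sscc: str) -> bool:
    """Validate a full 18-digit SSCC against its embedded check digit."""
    return len(sscc) == 18 and sscc.isdigit() and \
           int(sscc[-1]) == sscc_check_digit(sscc[:-1])

# Example with a made-up extension digit, GS1 company prefix and serial reference:
base = "38571234560000001"
print(sscc_check_digit(base))            # computed check digit
print(is_valid_sscc(base + str(sscc_check_digit(base))))   # True
```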
Procedia PDF Downloads 359
1886 Model Driven Architecture Methodologies: A Review
Authors: Arslan Murtaza
Abstract:
Model Driven Architecture (MDA) is a technique presented by the OMG (Object Management Group) for software development in which different models are proposed and then converted into code. The main idea is to specify the task using a PIM (Platform Independent Model), transform it into a PSM (Platform Specific Model), and then convert it into code. This review paper describes some challenges and issues that are faced in MDA, the types and transformations of models (e.g. CIM, PIM and PSM), and an evaluation of MDA-based methodologies. Keywords: OMG, model driven architecture (MDA), computation independent model (CIM), platform independent model (PIM), platform specific model (PSM), MDA-based methodologies
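To make the PIM → PSM → code chain concrete, here is a deliberately toy model-to-text transformation; the model structure, the type mapping, and the target platform (SQL DDL) are illustrative assumptions and not part of the methodologies reviewed.

```python
# Toy MDA-style transformation: a platform-independent model (PIM) of an
# entity is mapped to a platform-specific artefact (here, a SQL CREATE TABLE).
pim = {
    "entity": "Customer",
    "attributes": [("id", "Integer"), ("name", "String"), ("email", "String")],
}

TYPE_MAP = {"Integer": "INTEGER", "String": "VARCHAR(255)"}   # PIM type -> SQL type

def pim_to_sql(model: dict) -> str:
    cols = ",\n  ".join(f"{name} {TYPE_MAP[t]}" for name, t in model["attributes"])
    return f"CREATE TABLE {model['entity']} (\n  {cols}\n);"

print(pim_to_sql(pim))
```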
Procedia PDF Downloads 457
1885 Bio Ethanol Production From the Co-Mixture of Jatropha Curcas L. Kernel Cake and Rice Straw
Authors: Felix U. Asoiro, Daniel I. Eleazar, Peter O. Offor
Abstract:
As a result of increasing energy demands, research in bioethanol has increased in recent years throughout the world, in a bid to partially or totally replace non-renewable energy supplies. The first and third generation feedstocks used for biofuel production have fundamental drawbacks. Waste rice straw and cake from a second generation feedstock like Jatropha curcas L. kernel (JC) are seen as non-food feedstocks and promising candidates for the industrial production of bioethanol. In this study, JC and rice husk (RH) wastes were characterized for proximate composition. Bioethanol was produced from the residual polysaccharides present in rice husk (RH) and Jatropha seed cake by sequential hydrolytic and fermentative processes at varying mixing proportions (50 g JC/50 g RH, 100 g JC/10 g RH, 100 g JC/20 g RH, 100 g JC/50 g RH, 100 g JC/100 g RH, 100 g JC/200 g RH and 200 g JC/100 g RH) and particle sizes (0.25, 0.5 and 1.00 mm). Mixing proportions and particle size significantly affected both the bioethanol yield and some bioethanol properties. Bioethanol yield (%) increased with an increase in particle size. The highest bioethanol yield (8.67%) was produced at a mixing proportion of 100 g JC/50 g RH at 0.25 mm particle size. The bioethanol had the lowest values of specific gravity and density of 1.25 and 0.92 g cm-3 and the highest values of 1.57 and 0.97 g cm-3, respectively. The highest value of viscosity (4.64 cSt) was obtained with 200 g JC/100 g RH at 1.00 mm particle size. The maximum flash point and cloud point values were 139.9 °C and 23.7 °C (100 g JC/200 g RH) at 1 mm and 0.5 mm particle sizes, respectively. The maximum pour point value recorded was 3.85 °C (100 g JC/50 g RH) at 1 mm particle size. The paper concludes that bioethanol can be recovered from JC and RH wastes. JC and RH blending proportions, as well as particle sizes, are important factors in bioethanol production. Keywords: bioethanol, hydrolysis, Jatropha curcas L. kernel, rice husk, fermentation, proximate composition
Procedia PDF Downloads 94
1884 Municipal Leachate Treatment by Using Polyaluminium Chloride as a Coagulant
Authors: Syeda Azeem Unnisa
Abstract:
The present study was undertaken in 2017 at the Jawaharnagar solid waste municipal dumpsite, Greater Hyderabad Municipal Corporation, Telangana State, India, which generates 90,000 litres of leachate per day. The main objective of the leachate treatment was to remove organic compounds, color, suspended solids, ammonia and COD by coagulation-flocculation using polyaluminum chloride (PAC) as coagulant, which has higher coagulation efficiency and relatively low cost compared to conventional coagulants. A jar test apparatus was used to conduct experiments at pH 7, a rapid mixing speed of 150 rpm for 3 minutes, a slow mixing speed of 30 rpm for 20 minutes and a settling time of 30 minutes for different dosages of PAC (0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5 and 5.0 g/L). The highest percentage removals of suspended solids, color, COD and ammoniacal nitrogen were 97%, 96%, 60% and 37%, with a PAC optimum dose of 2.0 g/L. The results indicate that PAC was effective in leachate treatment, is well suited to highly toxic waste, and is economically feasible for Indian conditions. The treated water can be utilized for purposes other than drinking. Keywords: coagulant, leachate, polyaluminium chloride, treatment
Procedia PDF Downloads 205
1883 Integrating the Athena Vortex Lattice Code into a Multivariate Design Synthesis Optimisation Platform in JAVA
Authors: Paul Okonkwo, Howard Smith
Abstract:
This paper describes a methodology to integrate the Athena Vortex Lattice aerodynamic software for automated operation in a multivariate optimisation of the Blended Wing Body aircraft. The Athena Vortex Lattice code, developed at the Massachusetts Institute of Technology by Mark Drela, allows for the aerodynamic analysis of aircraft using the vortex lattice method. Ordinarily, Athena Vortex Lattice operation requires a text file containing the aircraft geometry to be loaded into the AVL solver in order to determine the aerodynamic forces and moments. However, automated operation will be required to enable integration into a multidisciplinary optimisation framework. Automated AVL operation within the JAVA design environment will nonetheless require a modification and recompilation of the AVL source code into an executable file capable of running on Windows and other platforms without the -X11 libraries. This paper describes the procedure for integrating the FORTRAN-written AVL software for automated operation within the multivariate design synthesis optimisation framework for the conceptual design of the BWB aircraft. Keywords: aerodynamics, automation, optimisation, AVL, JNI
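One way to drive a recompiled AVL executable without manual interaction is to pipe a command script to its standard input from the host program. A rough sketch follows, shown in Python for brevity rather than the JAVA/JNI route described above; the executable path, geometry file name, and command sequence are illustrative assumptions that would need adapting to the actual AVL command interface.

```python
import subprocess

AVL_EXE = "./avl"            # assumed path to the recompiled AVL executable
GEOMETRY = "bwb.avl"         # assumed text file containing the BWB geometry

# Hypothetical interactive command sequence piped to AVL's standard input.
commands = "\n".join([
    f"LOAD {GEOMETRY}",      # load the geometry file
    "OPER",                  # enter the operating-point menu
    "A A 4.0",               # e.g. constrain the angle of attack (illustrative)
    "X",                     # execute the vortex-lattice analysis
    "",                      # back out of the menu
    "QUIT",
]) + "\n"

result = subprocess.run([AVL_EXE], input=commands, text=True,
                        capture_output=True, timeout=120)
print(result.stdout)         # forces and moments would be parsed from this text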
Procedia PDF Downloads 582
1882 A Case Study Report on Acoustic Impact Assessment and Mitigation of the Hyprob Research Plant
Authors: D. Bianco, A. Sollazzo, M. Barbarino, G. Elia, A. Smoraldi, N. Favaloro
Abstract:
The activities described in the present paper have been conducted in the framework of the HYPROB-New Program, carried out by the Italian Aerospace Research Centre (CIRA) and promoted and funded by the Italian Ministry of University and Research (MIUR) in order to improve the national background on rocket engine systems for space applications. The Program has the strategic objective of improving national system and technology capabilities in the field of liquid rocket engines (LRE) for future space propulsion system applications, with specific regard to LOX/LCH4 technology. The main purpose of the HYPROB program is to design and build a Propulsion Test Facility (HIMP) allowing test activities on liquid thrusters. The development of skills in liquid rocket propulsion can only pass through extensive test campaigns. Following its mission, CIRA has planned the development of new testing facilities and infrastructures for space propulsion, characterized by adequate sizes and instrumentation. The IMP test cell is devoted to testing articles representative of small combustion chambers, fed with oxygen and methane, in both liquid and gaseous phases. This article describes the activities that have been carried out for the evaluation of the acoustic impact and its consequent mitigation. The impact of the simulated acoustic disturbance has been evaluated, first, using an approximate method based on experimental data by Baumann and Coney, included in “Noise and Vibration Control Engineering” edited by Vér and Beranek. This methodology, used to evaluate the free-field radiation of a jet in an ideal acoustical medium, analyzes the jet noise in detail and assumes sources acting at the same time. It considers as principal radiation source the jet mixing noise, caused by the turbulent mixing of jet gas and the ambient medium. Empirical models, allowing a direct calculation of the Sound Pressure Level, are commonly used for rocket noise simulation. The model named after K. Eldred is probably one of the most exploited in this area. In this paper, an improvement of the Eldred standard model has been used for a detailed investigation of the acoustical impact of the Hyprob facility. This new formulation contains an explicit expression for the acoustic pressure of each equivalent noise source, in terms of amplitude and phase, allowing the investigation of the source correlation effects and their propagation through wave equations. In order to enhance the evaluation of the facility acoustic impact, including an assessment of the mitigation strategies to be set in place, a more advanced simulation campaign has been conducted using both an in-house code for noise propagation and scattering, and a commercial code for industrial noise environmental impact, CadnaA. The noise prediction obtained with the revised Eldred-based model has then been used for formulating an empirical/BEM (Boundary Element Method) hybrid approach allowing the evaluation of the barrier mitigation effect at the design stage. This approach has been compared with the analogous empirical/ray-acoustics approach, implemented within CadnaA using a customized definition of sources and directivity factor. The resulting impact evaluation study is reported here, along with the design-level barrier optimization for noise mitigation. Keywords: acoustic impact, industrial noise, mitigation, rocket noise
Procedia PDF Downloads 145
1881 Influence of Deficient Materials on the Reliability of Reinforced Concrete Members
Authors: Sami W. Tabsh
Abstract:
The strength of reinforced concrete depends on the member dimensions and material properties. The properties of concrete and steel materials are not constant but random variables. The variability of concrete strength is due to batching errors, variations in mixing, cement quality uncertainties, differences in the degree of compaction and disparity in curing. Similarly, the variability of steel strength is attributed to the manufacturing process, rolling conditions, characteristics of the base material, uncertainties in chemical composition, and the microstructure-property relationships. To account for such uncertainties, codes of practice for reinforced concrete design impose resistance factors to ensure structural reliability over the useful life of the structure. In this investigation, the effects of reductions in concrete and reinforcing steel strengths from the nominal values, beyond those accounted for in the structural design codes, on the structural reliability are assessed. The considered limit states are flexure, shear and axial compression based on the ACI 318-11 structural concrete building code. Structural safety is measured in terms of a reliability index. Probabilistic resistance and load models are compiled from the available literature. The study showed that there is a wide variation in the reliability index for reinforced concrete members designed for flexure, shear or axial compression, especially when the live-to-dead load ratio is low. Furthermore, variations in concrete strength have a minor effect on the reliability of beams in flexure, a moderate effect on the reliability of beams in shear, and a severe effect on the reliability of columns in axial compression. On the other hand, changes in steel yield strength have a great effect on the reliability of beams in flexure, a moderate effect on the reliability of beams in shear, and a mild effect on the reliability of columns in axial compression. Based on the outcome, it can be concluded that the reliability of beams is sensitive to changes in the yield strength of the steel reinforcement, whereas the reliability of columns is sensitive to variations in the concrete strength. Since the embedded target reliability in structural design codes results in lower structural safety in beams than in columns, large reductions in material strengths compromise the structural safety of beams much more than they affect columns. Keywords: code, flexure, limit states, random variables, reinforced concrete, reliability, reliability index, shear, structural safety
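For readers unfamiliar with how a reliability index is obtained from probabilistic resistance and load models, the sketch below estimates β for a simple limit state g = R − (D + L) by Monte Carlo simulation; the bias factors and coefficients of variation are placeholder assumptions, not the statistics compiled in this study.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(0)
n = 1_000_000

# Placeholder statistics (bias and COV values are assumptions, not the study's models)
Rn = 1.0                                                      # nominal flexural resistance (normalised)
R = rng.lognormal(mean=np.log(1.05 * Rn) - 0.5 * 0.10**2, sigma=0.10, size=n)
D = rng.normal(1.05 * 0.35, 0.10 * 0.35, n)                   # dead-load effect
L = rng.normal(1.00 * 0.25, 0.25 * 0.25, n)                   # live-load effect

pf = np.mean(R - (D + L) < 0.0)                               # probability of failure
beta = -NormalDist().inv_cdf(pf)                              # reliability index
print(f"Pf = {pf:.2e}, beta = {beta:.2f}")
```

Reductions in material strength beyond the nominal assumptions can be explored by lowering the resistance bias and re-running the simulation.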
Procedia PDF Downloads 430
1880 Evaluation of Response Modification Factors in Moment Resisting Frame Buildings Considering Soil Structure Interaction
Authors: K. Farheen, A. Munir
Abstract:
The seismic response of multi-storey buildings is created by the interaction of both the structure and the underlying soil medium. The seismic design philosophy is incorporated using the response modification factor 'R'. Current code-based values of the 'R' factor do not reflect the SSI problem, as they are based on the fixed-base condition. In this study, modified values of the 'R' factor for moment resisting frames (MRF) considering SSI are evaluated. The response of the structure with and without SSI has been compared using equivalent linear static and nonlinear static pushover analyses for a 10-storey moment resisting frame building. The building is located in seismic zone 2B and situated on different soils with shear wave velocities (Vₛ) of 300 m/s (SD) and 1200 m/s (SB). The code-based 'R' factor value for the building frame system has been taken as 5.5. The soil medium is modelled using identical but mutually independent horizontal and vertical springs. It was found that the modified 'R' factor values decreased by 47% and 43% for soils SD and SB, respectively, compared to the code-based 'R' factor. Keywords: buildings, SSI, shear wave velocity, R factor
Procedia PDF Downloads 211
1879 Magnetofluidics for Mass Transfer and Mixing Enhancement in a Micro Scale Device
Authors: Majid Hejazian, Nam-Trung Nguyen
Abstract:
Over the past few years, microfluidic devices have generated significant attention from industry and academia due to advantages such as small sample volume, low cost and high efficiency. Microfluidic devices have applications in chemical, biological and industrial analysis and can facilitate assays of biomaterials and chemical reactions, separation, and sensing. Micromixers are one of the important microfluidic concepts. Micromixers can work as stand-alone devices or be integrated in a more complex microfluidic system such as a lab on a chip (LOC). Micromixers are categorized as passive and active types. Passive micromixers rely only on the arrangement of the phases to be mixed, contain no moving parts and require no energy. Active micromixers require external fields such as pressure, temperature, electric and acoustic fields. Rapid and efficient mixing is important for many applications such as biological, chemical and biochemical analysis. Achieving fast and homogeneous mixing of multiple samples in microfluidic devices has been studied and discussed in the literature recently. Improvements in mixing rely on effective mass transport at the microscale, which is currently limited to molecular diffusion due to the predominant laminar flow at this size scale. Using a magnetic field to enhance mass transport is an effective solution for mixing enhancement in microfluidics. The use of a non-uniform magnetic field to improve mass transfer performance in a microfluidic device is demonstrated in this work. The phenomenon of mixing ferrofluid and DI-water streams has been reported before, but mass transfer enhancement for other non-magnetic species through a magnetic field has not been studied and evaluated extensively. In the present work, permanent magnets were used in a simple microfluidic device to create a non-uniform magnetic field. Two streams are introduced into the microchannel: one contains fluorescent dye mixed with diluted ferrofluid to induce enhanced mass transport of the dye, and the other one is a non-magnetic DI-water stream. Mass transport enhancement of the fluorescent dye is evaluated using fluorescence measurement techniques. The concentration field is measured for different flow rates. Due to the effect of the magnetic field, a body force is exerted on the paramagnetic stream and expands the ferrofluid stream into the non-magnetic DI-water flow. The experimental results demonstrate that without a magnetic field, both the magnetic nanoparticles of the ferrofluid and the fluorescent dye rely solely on molecular diffusion to spread. The non-uniform magnetic field, created by the permanent magnets around the microchannel, and diluted ferrofluid can improve mass transport of non-magnetic solutes in a microfluidic device. The susceptibility mismatch between the fluids results in a magnetoconvective secondary flow towards the magnets and subsequently enhances the mass transport of the non-magnetic fluorescent dye. A significant enhancement in mass transport of the fluorescent dye was observed. The platform presented here could be used as a microfluidics-based micromixer for chemical and biological applications. Keywords: ferrofluid, mass transfer, micromixer, microfluidics, magnetic
Procedia PDF Downloads 224
1878 First Formaldehyde Retrieval Using the Raw Data Obtained from Pandora in Seoul: Investigation of the Temporal Characteristics and Comparison with Ozone Monitoring Instrument Measurement
Abstract:
In the present study, for the first time, we retrieved the formaldehyde (HCHO) vertical column density (HCHO VCD) using Pandora instruments in Seoul, a megacity in northeast Asia, for the period between 2012 and 2014 and investigated the temporal characteristics of the HCHO VCD. The HCHO slant column density (HCHO SCD) was obtained using the Differential Optical Absorption Spectroscopy (DOAS) method. The HCHO SCD was converted to the HCHO VCD using the geometric air mass factor (AMFG), as Pandora is a direct-sun measurement. The HCHO VCD is low at 12:00 local time (LT) and high in the morning (10:00 LT) and late afternoon (16:00 LT), except in winter. The maximum (minimum) values of the Pandora HCHO VCD are 2.68×10¹⁶ (1.63×10¹⁶), 3.19×10¹⁶ (2.23×10¹⁶), 2.00×10¹⁶ (1.26×10¹⁶), and 1.63×10¹⁶ (0.82×10¹⁶) molecules cm⁻² in spring, summer, autumn, and winter, respectively. In terms of seasonal variations, the HCHO VCD was high in summer and low in winter, which implies that photo-oxidation plays an important role in HCHO production in Seoul. In comparison with the Ozone Monitoring Instrument (OMI) measurements, the HCHO VCDs from OMI are lower than those from Pandora. The correlation coefficient (R) between the monthly HCHO VCD values from Pandora and OMI is 0.61, with a slope of 0.35. Furthermore, to understand the HCHO mixing ratio within the planetary boundary layer (PBL) in Seoul, we converted the Pandora HCHO VCDs to HCHO mixing ratios in the PBL using several meteorological input data from the Atmospheric InfraRed Sounder (AIRS). Seasonal HCHO mixing ratios in the PBL converted from the Pandora (OMI) HCHO VCDs are estimated to be 6.57 (5.17), 7.08 (6.68), 7.60 (4.70), and 5.00 (4.76) ppbv in spring, summer, autumn, and winter, respectively. Keywords: formaldehyde, OMI, Pandora, remote sensing
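For a direct-sun instrument such as Pandora, the geometric air mass factor is simply the secant of the solar zenith angle, so the slant-to-vertical conversion used above reduces to a one-line formula; a minimal sketch follows (the numbers are illustrative, not retrieved values).

```python
import math

def vcd_from_scd(scd_molec_cm2: float, sza_deg: float) -> float:
    """Convert a direct-sun slant column density to a vertical column density
    using the geometric air mass factor AMF_G = 1 / cos(SZA)."""
    amf_g = 1.0 / math.cos(math.radians(sza_deg))
    return scd_molec_cm2 / amf_g

# Illustrative only: a slant column of 4.0e16 molecules cm^-2 at SZA = 60 deg
print(f"{vcd_from_scd(4.0e16, 60.0):.2e} molecules cm^-2")   # -> 2.00e+16
```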
Procedia PDF Downloads 150
1877 Cross Project Software Fault Prediction at Design Phase
Authors: Pradeep Singh, Shrish Verma
Abstract:
Software fault prediction models are created by using the source code, processed metrics from the same or a previous version of the code, and related fault data. Some companies do not store and keep track of all the artifacts which are required for software fault prediction. To construct a fault prediction model for such a company, training data from other projects can be one potential solution. The earlier we predict a fault, the less it costs to correct. The training data consist of metrics data and related fault data at the function/module level. This paper investigates fault prediction at an early stage using cross-project data, focusing on the design metrics. In this study, an empirical analysis is carried out to validate design metrics for cross-project fault prediction. The machine learning technique used for evaluation is Naïve Bayes. The design-phase metrics of other projects can be used as an initial guideline for projects where no previous fault data is available. We analyze seven data sets from the NASA Metrics Data Program which offer design as well as code metrics. Overall, the results of cross-project prediction are comparable to within-company learning. Keywords: software metrics, fault prediction, cross project, within project
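A minimal sketch of the cross-project setup described above, training a Naïve Bayes classifier on one project's design metrics and evaluating it on another, follows; the file names and metric columns are placeholders standing in for the NASA MDP datasets.

```python
import pandas as pd
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import classification_report

# Design-phase metrics from a source project (training) and a target project (test).
# File names and metric columns are placeholders for the NASA MDP datasets.
DESIGN_METRICS = ["node_count", "edge_count", "branch_count", "call_pairs"]

train = pd.read_csv("project_A_design_metrics.csv")
test = pd.read_csv("project_B_design_metrics.csv")

model = GaussianNB()
model.fit(train[DESIGN_METRICS], train["defective"])

pred = model.predict(test[DESIGN_METRICS])
print(classification_report(test["defective"], pred))
```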
Procedia PDF Downloads 343
1876 Reduction of Multiple User Interference for Optical CDMA Systems Using Successive Interference Cancellation Scheme
Authors: Tawfig Eltaif, Hesham A. Bakarman, N. Alsowaidi, M. R. Mokhtar, Malek Harbawi
Abstract:
Commonly, a primary problem in optical code-division multiple access (OCDMA) systems is the multiple user interference (MUI) noise resulting from the overlap among users. In this article, we aim to mitigate this problem by studying an interference cancellation scheme called the successive interference cancellation (SIC) scheme. This scheme is tested on two different detection schemes, spectral amplitude coding (SAC) and direct detection systems (DS), using partial modified prime (PMP) codes as the signature codes. It was found that the SIC scheme based on both SAC and DS methods had the potential to suppress the intensity noise, that is to say, it can mitigate MUI noise. Furthermore, the SIC/DS scheme showed a much lower bit error rate (BER) relative to the SIC/SAC scheme for different magnitudes of effective power. Hence, many more users can be supported by the SIC/DS receiver system. Keywords: optical code-division multiple access (OCDMA), successive interference cancellation (SIC), multiple user interference (MUI), spectral amplitude coding (SAC), partial modified prime code (PMP)
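The essence of successive interference cancellation is to detect the strongest user, regenerate its contribution, subtract it from the received signal, and repeat for the remaining users. The sketch below illustrates that loop for a generic unipolar-code system with an assumed decision threshold; it is not specific to the SAC or DS receivers or the PMP codes studied here.

```python
import numpy as np

def sic_detect(received, signatures, amplitudes):
    """Generic successive interference cancellation: detect the strongest user,
    reconstruct its contribution, subtract it, and repeat for the others."""
    residual = received.astype(float).copy()
    order = np.argsort(amplitudes)[::-1]                 # strongest user first
    bits = np.zeros(len(signatures), dtype=int)
    for k in order:
        stat = residual @ signatures[k]                  # correlate with user k's code
        threshold = 0.5 * amplitudes[k] * (signatures[k] @ signatures[k])
        bits[k] = 1 if stat > threshold else 0
        residual -= bits[k] * amplitudes[k] * signatures[k]   # cancel detected user
    return bits

# Illustrative unipolar signature codes and amplitudes (not PMP codes)
codes = np.array([[1, 0, 1, 0, 1, 0], [0, 1, 0, 1, 0, 1], [1, 1, 0, 0, 1, 0]], dtype=float)
amps = np.array([1.0, 0.8, 0.6])
tx_bits = np.array([1, 0, 1])
rx = (tx_bits * amps) @ codes                            # noiseless superposition with MUI
print(sic_detect(rx, codes, amps))                       # -> [1 0 1]
```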
Procedia PDF Downloads 520
1875 Impact of Flavor on Food Product Quality, A Case Study of Vanillin Stability during Biscuit Preparation
Authors: N. Yang, R. Linforth, I. Fisk
Abstract:
The influence of food processing and the choice of flavour solvent was investigated using biscuits prepared with vanillin flavour as an example. Powdered vanillin was either added directly into the dough or dissolved into a flavour solvent and then mixed into the dough. The impact of two commonly used flavour solvents on food quality was compared: propylene glycol (PG) or triacetin (TA). The analytical approach for vanillin detection was developed by chromatography (HPLC-PDA), and the standard extraction method for vanillin was also established. The results indicated the impact of solvent choice on the vanillin level during biscuit preparation. After baking, TA, as a more heat-resistant solvent, retained more vanillin than PG, so TA is a better solvent for products that undergo a heating process. The results also illustrated the impact of mixing and baking on vanillin stability in the matrices. The average loss of vanillin was 33% during mixing and 13% during baking, which indicated that the binding of vanillin to fat or flour before baking might cause a larger loss than evaporation during baking. Keywords: biscuit, flavour stability, food quality, vanillin
Procedia PDF Downloads 508
1874 Cognitive Function During the First Two Hours of Spravato Administration in Patients with Major Depressive Disorder
Authors: Jocelyn Li, Xiangyang Li
Abstract:
We have employed THINC-it® to study the acute effects of Spravato on the cognitive function of patients with severe major depressive disorder (MDD). The scores of the four tasks (Spotter, Symbol Check, Code Breaker, Trails) found in THINC-it® were used to measure cognitive function throughout treatment. The patients who participated in this study had tried more than 3 antidepressants without significant improvement before they began Spravato treatment. All patients received 3 doses of 28 mg Spravato 5 minutes apart (84 mg total per treatment) during this study with THINC-it®. The data were collected before the first Spravato administration (T0), 1 hour after (T1), and 2 hours after (T2) during each treatment. The following data were from 13 patients, with a total of 226 trials over a 2-3 month period. Spravato at 84 mg reduced the scores of Trails, Code Breaker, Symbol Check, and Spotter at T1 by 10-20% in all patients, with one exception in Spotter for a minority of patients. At T2, the scores of Trails, Symbol Check, and Spotter were back to 97% of T0, while the score of Code Breaker was back to 92%. Interestingly, we found that the score of Spotter was consistently increased by 17% at T1 in the same 30% of patients in each treatment. We called this change a reverse response, while the pattern of the other patients, a decline (T1) and then recovery (T2), was called a non-reverse response. We also compared the scores at T0 between the first visit and the fifth visit. The T0 scores of all four tasks were improved at visit 5 when compared to visit 1. The scores of Trails, Code Breaker, and Symbol Check at T0 were increased by 14%, 33%, and 14%, respectively, at visit 5. The score of Code Breaker, which had two trends, improved by 9% in reverse-response patients compared to a 27% improvement in non-reverse-response patients. To our knowledge, this is the first study done on the impact of Spravato on cognitive function change in major depression patients in this time frame. Whether we can predict future responses to Spravato with THINC-it® merits further study. Keywords: Spravato, THINC-it, major depressive disorder, cognitive function
Procedia PDF Downloads 115
1873 Multisignature Schemes for Reinforcing Trust in Cloud Software-As-A-Service Services
Authors: Mustapha Hedabou, Ali Azougaghe, Ahmed Bentajer, Hicham Boukhris, Mourad Eddiwani, Zakaria Igarramen
Abstract:
Software-as-a-service (SaaS) is emerging as a dominant approach to delivering software. It encompasses a range of business and technical opportunities, issues, and challenges. Trust in cloud services, regarding the security and the privacy of the delivered data, is the most critical issue with the SaaS model. In this paper, we survey the security concerns related to the SaaS model, and we propose the design of a trusted SaaS model that gives users more confidence in SaaS services by leveraging trust in a neutral source code certifying authority. The proposed design is based on the use of a multisignature mechanism for signing the source code of the application service. In our model, the cloud provider acts as a root of trust by ensuring the integrity of the application service while it is running on its platform. The proposed design prevents insider attacks from tampering with the application service before and after it is launched on a cloud provider platform. Keywords: cloud computing, SaaS platform, TPM, trustiness, code source certification, multi-signature schemes
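As a simplified illustration of the signing idea (not the scheme proposed in the paper, which uses a true multisignature rather than independent signatures), the sketch below has several parties sign a hash of the application source with Ed25519 keys and requires every signature to verify before the code is trusted; the party names and the source snippet are placeholders.

```python
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

def code_digest(source: bytes) -> bytes:
    return hashlib.sha256(source).digest()

# Parties that must co-sign the service code (illustrative roles)
signers = {name: Ed25519PrivateKey.generate()
           for name in ("vendor", "certifying_authority", "cloud_provider")}

source = b"def handler(request): ..."          # application service source (placeholder)
digest = code_digest(source)
signatures = {name: key.sign(digest) for name, key in signers.items()}

def all_signatures_valid(candidate: bytes) -> bool:
    """Trust the code only if every party's signature verifies over its hash."""
    d = code_digest(candidate)
    for name, key in signers.items():
        try:
            key.public_key().verify(signatures[name], d)
        except InvalidSignature:
            return False
    return True

print(all_signatures_valid(source))                 # True
print(all_signatures_valid(source + b"tampered"))   # False
```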
Procedia PDF Downloads 274
1872 Secure Distance Bounding Protocol on Ultra-WideBand Based Mapping Code
Authors: Jamel Miri, Bechir Nsiri, Ridha Bouallegue
Abstract:
Ultra-WideBand IR physical layer technology has seen great development during the last decade, which makes it a promising candidate for short-range wireless communications, as it brings considerable benefits in terms of connectivity and mobility. However, like all wireless communications, it suffers from vulnerabilities in terms of security because of the open nature of the radio channel. To face these attacks, distance bounding protocols are the most popular countermeasures. In this paper, we present a protocol based on distance bounding to thwart the most popular attacks: distance fraud, mafia fraud and terrorist fraud. In our work, we study how to adapt the best secure distance bounding protocols to the mapping code of ultra-wideband (TH-UWB) radios. Indeed, to improve the performance of the protocol in terms of secure communication in TH-UWB, we combine the modified protocol with ultra-wideband impulse radio technology (IR-UWB). The security and the different merits of the protocols are analyzed. Keywords: distance bounding, mapping code ultrawideband, terrorist fraud, physical layer technology
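At the heart of any distance bounding protocol is a series of timed challenge-response rounds: the verifier converts each measured round-trip time into an upper bound on the prover's distance, d ≤ c·(t_round − t_processing)/2, and accepts only if every round is consistent with the claimed range. A minimal sketch with assumed timing values follows.

```python
C = 299_792_458.0            # speed of light, m/s

def distance_upper_bound(t_round_ns: float, t_proc_ns: float) -> float:
    """Upper bound on the prover's distance from one challenge-response round:
    d <= c * (t_round - t_processing) / 2."""
    return C * (t_round_ns - t_proc_ns) * 1e-9 / 2.0

def verifier_accepts(rounds_ns, t_proc_ns=1.0, claimed_distance_m=10.0, n_required=8):
    """Accept only if enough rounds were run and every bound matches the claim."""
    bounds = [distance_upper_bound(t, t_proc_ns) for t in rounds_ns]
    return len(bounds) >= n_required and all(b <= claimed_distance_m for b in bounds)

# Illustrative round-trip times (nanoseconds) measured over the UWB channel
print(verifier_accepts([61.0, 60.5, 62.0, 61.2, 60.8, 61.5, 60.9, 61.1]))   # True
```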
Procedia PDF Downloads 298
1871 Development of CaO-based Sorbents Applied to Sorption Enhanced Steam Reforming Processes
Authors: P. Comendador, I. Garcia, S. Orozco, L. Santamaria, M. Amutio, G. Lopez, M. Olazar
Abstract:
In situ CO₂ capture in steam reforming processes has been studied in recent years as an alternative for increasing H₂ yields and H₂ purity in the product stream. For capturing the CO₂ at reforming conditions, CaO-based sorbents are usually employed due to their properties at high temperature, low cost and high availability. However, the challenge is to develop high-capacity (gCO₂/gsorbent) materials that retain their capacity over cycles of operation. Besides, since the objective is to capture the CO₂ generated in situ, another key aspect is the sorption dynamics, which means that, in order to use the sorbent efficiently, it has to capture the CO₂ at a rate equal to or higher than the generation rate. In this work, different CaO-based materials have been prepared with the aim of meeting these criteria. First, using the wet mixing method, different inert materials (Mg, Ce and Al) were combined with CaO. Second, with the inert material selected (Mg), the effect of its concentration in the final material was studied. In parallel, the calcination temperature was also evaluated. It was determined that the wet mixing method is a simple procedure suitable for the preparation of CaO sorbents mixed with inert materials. The materials prepared by mixing the CaO with Mg have shown satisfactory anti-sintering properties and adequate sorption kinetics for their application in steam reforming processes. Regarding the concentration of Mg in the solid, it was concluded that high values contribute to the stability, but at the expense of losing sorption capacity. Finally, it was observed that high calcination temperatures negatively affected the sorption properties of the final materials due to the decrease in pore volume and specific surface area. Keywords: calcination temperature effect, CO₂ capture, Mg-Ce-Al stabilizers, Mg varying concentration effect, sorbent stabilization
Procedia PDF Downloads 79
1870 Hydrodynamics of Shear Layers at River Confluences by Formation of Secondary Circulation
Authors: Ali Aghazadegan, Ali Shokri, Julia Mullarney
Abstract:
River confluences are areas of intense mixing, which is often caused by the formation of shear layers and helical motions. The hydrodynamics of secondary circulation at river confluences with low flow discharge ratios and a 90° junction angle are investigated in this study. The analysis is based on Delft3D modelling, which includes a three-dimensional time-averaged velocity field, turbulence, and water surface levels that have been validated using laboratory data. The confluence structure was characterized by the shear layer, secondary circulation, and mixing at the junction and in the post-confluence channel. This study analyses the formation of the shear layer through the generation of secondary circulations at varying discharge ratios. The values of the streamwise, cross-wise, and vertical components are used to estimate the secondary circulation observed within and downstream of the tributary mouth. These variables are estimated for three horizontal planes at Z = [0.14; 0.07; 0.02] and for eight cross-sections at X = [-0.1; 0.00; 0.10; 0.2; 0.30; 0.4; 0.5; 0.6] within the range 0.05 ≤ Y ≤ 0.30. Keywords: river confluence, shear layer, secondary circulation, hydrodynamics
Procedia PDF Downloads 95
1869 Study on the Thermal Mixing of Steam and Coolant in the Hybrid Safety Injection Tank
Authors: Sung Uk Ryu, Byoung Gook Jeon, Sung-Jae Yi, Dong-Jin Euh
Abstract:
In passive safety injection systems of nuclear power plants, such as the Core Makeup Tank (CMT) and the Hybrid Safety Injection Tank, various thermal-hydraulic phenomena occur, including the direct contact condensation of steam and the thermal stratification of the coolant. These phenomena are closely related to the performance of the system. Depending on the condensation rate of the steam injected into the tank, the coolant injection and pressure-equalizing timings of the tank are decided. The steam injected into the tank from the upper nozzle penetrates the coolant and induces direct contact condensation. In the present study, the direct contact condensation of steam and the thermal mixing between the steam and coolant were examined by using the Particle Image Velocimetry (PIV) technique. In particular, by altering the size of the nozzle from which the steam is injected, the influence of the steam injection velocity on the thermal mixing with the coolant and on condensation was examined, while also investigating the influence of condensation on the pressure variation inside the tank. Even though the amounts of steam injected were the same for the three different nozzle sizes, it was found that the rate of pressure rise becomes lower as the steam injection area decreases. Also, as the steam injection area increases, the thickness of the zone within which the coolant's temperature decreases; thereby, the amount of steam condensed by direct contact condensation also decreases. The results derived from the present study can be utilized for the detailed design of a passive safety injection system, as well as for modeling the direct contact condensation triggered by the steam jet's penetration into the coolant. Keywords: passive safety injection systems, steam penetration, direct contact condensation, particle image velocimetry
Procedia PDF Downloads 394
1868 Used MATLAB Code to Study the Vehicle Bridge Coupling Vibration Based On the Method of Newmark-β
Authors: Saidi Abdelkrim, Hamouine Abdelmadjid, Abdellatif Megnounif
Abstract:
The study of the interaction between vehicles and bridge structures has become extremely important. Large deflections and vibrations induced by heavy and high-speed vehicles significantly affect the safety and efficiency of a bridge. The vibration of a bridge caused by the passage of vehicles is one of the most important considerations in the design of a bridge as a common type of transportation structure. A major goal of this study is to create a simplified model of a vehicle-bridge system in MATLAB. The model will then be used to study the influence of parameters on vehicle-bridge vibrations. Keywords: vehicle-bridge interaction, Newmark-β, MATLAB code
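For reference, the Newmark-β method named in the title advances the equation of motion m·ü + c·u̇ + k·u = p(t) step by step. A minimal single-degree-of-freedom sketch of the constant-average-acceleration variant (β = 1/4, γ = 1/2) is given below, written in Python rather than MATLAB; the properties and loading are illustrative and are not the paper's vehicle-bridge model.

```python
import numpy as np

def newmark_sdof(m, c, k, p, dt, beta=0.25, gamma=0.5, u0=0.0, v0=0.0):
    """Incremental Newmark-beta integration of m*u'' + c*u' + k*u = p(t)
    for a single degree of freedom (average acceleration for beta=1/4, gamma=1/2)."""
    n = len(p)
    u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    u[0], v[0] = u0, v0
    a[0] = (p[0] - c * v0 - k * u0) / m

    k_hat = k + gamma * c / (beta * dt) + m / (beta * dt**2)
    A = m / (beta * dt) + gamma * c / beta
    B = m / (2 * beta) + dt * (gamma / (2 * beta) - 1.0) * c

    for i in range(n - 1):
        dp_hat = (p[i + 1] - p[i]) + A * v[i] + B * a[i]
        du = dp_hat / k_hat
        dv = gamma / (beta * dt) * du - gamma / beta * v[i] \
             + dt * (1.0 - gamma / (2 * beta)) * a[i]
        da = du / (beta * dt**2) - v[i] / (beta * dt) - a[i] / (2 * beta)
        u[i + 1], v[i + 1], a[i + 1] = u[i] + du, v[i] + dv, a[i] + da
    return u, v, a

# Illustrative use: a half-sine force pulse on a 1-DOF bridge-like oscillator
dt, T = 0.001, 2.0
t = np.arange(0.0, T, dt)
p = 1000.0 * np.sin(np.pi * t / 0.5) * (t < 0.5)     # force pulse, N
u, v, a = newmark_sdof(m=2000.0, c=400.0, k=5.0e5, p=p, dt=dt)
print(f"peak displacement = {u.max() * 1000:.2f} mm")
```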
Procedia PDF Downloads 615
1867 Modification of the Athena Vortex Lattice Code for the Multivariate Design Synthesis Optimisation of the Blended Wing Body Aircraft
Authors: Paul Okonkwo, Howard Smith
Abstract:
This paper describes a methodology to integrate the Athena Vortex Lattice aerodynamic software for automated operation in a multivariate optimisation of the Blended Wing Body aircraft. The Athena Vortex Lattice code, developed at the Massachusetts Institute of Technology by Mark Drela, allows for the aerodynamic analysis of aircraft using the vortex lattice method. Ordinarily, Athena Vortex Lattice operation requires a text file containing the aircraft geometry to be loaded into the AVL solver in order to determine the aerodynamic forces and moments. However, automated operation will be required to enable integration into a multidisciplinary optimisation framework. Automated AVL operation within the JAVA design environment will nonetheless require a modification and recompilation of the AVL source code into an executable file capable of running on Windows and other platforms without the -X11 libraries. This paper describes the procedure for integrating the FORTRAN-written AVL software for automated operation within the multivariate design synthesis optimisation framework for the conceptual design of the BWB aircraft. Keywords: aerodynamics, automation, optimisation, AVL
Procedia PDF Downloads 656
1866 Transmission of ASCII Code Messages Using a High Power (50mW) Underwater Laser Communication Prototype in Two Controlled Scenarios
Authors: Lessly Borja, Anthony Gualli, Kelly Baño, Fabricio Santacruz
Abstract:
In this article, a prototype of underwater communication using a long-range laser (50 mW) has been developed and tested in two aquatic scenarios (a fish tank and a swimming pool) with the aim of recreating Aqua-Fi technology (the future of underwater communications), using a Bluetooth connection to the transmitter to send data in ASCII code by means of light. Initially, the transmitter and receiver circuits were programmed in Arduino so that the data would travel as light pulses in the aforementioned code. To obtain the results of the underwater communication, two scenarios were chosen (fish tank and swimming pool), where the power of the received signal was calculated from its peak-to-peak voltage using an oscilloscope (ESPOCH). Finally, it was concluded that the maximum communication range of this prototype is 12 m underwater, and it was observed that the power decreases as the distance increases. However, this prototype still needs improved communication so that the information is not distorted or lost when there is movement and dispersion of the water. It is hoped that it will form the basis for future research. Keywords: prototype, underwater, communication, power, voltage, distance
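To illustrate how an ASCII message becomes the on/off light pulses mentioned above, here is a minimal encoding sketch; the bit period and the MSB-first, unframed format are assumptions for illustration only, not the prototype's Arduino firmware.

```python
BIT_PERIOD_MS = 10.0     # assumed on/off keying bit duration

def ascii_to_pulses(message: str):
    """Encode an ASCII message as a laser on/off pulse schedule:
    each character is sent as 8 bits, MSB first (framing bits omitted for clarity)."""
    schedule = []
    for ch in message:
        for bit in format(ord(ch), "08b"):
            schedule.append(("ON" if bit == "1" else "OFF", BIT_PERIOD_MS))
    return schedule

pulses = ascii_to_pulses("Hi")
print(len(pulses), "bit slots")       # 16 bit slots for two characters
print(pulses[:8])                     # the 8 slots encoding 'H' (0x48 = 01001000)
```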
Procedia PDF Downloads 87
1865 Approach to Study the Workability of Concrete with the Fractal Model
Authors: Achouri Fatima, Chouicha Kaddour
Abstract:
The main parameters affecting workability are the water content, the particle size, and the total surface area of the grains: the mixing water begins by wetting the surface of the grains and then fills the voids between the grains to form entrapped water, and the quantity of water remaining is called free water. The aim is to develop a fractal approach through the relationship between the concrete formulation parameters and workability. To develop this approach, a series of concretes taken from the literature was investigated by varying formulation parameters such as G/S, the quantity of cement C and the quantity of mixing water E. We also call on other models, namely the model for the thickness of the water layer and the model for the thickness of the paste layer, to judge their relevance. The results are as follows: the model of the thickness of the water layer is considered relevant when there is a variation in the water quantity, while the model of the thickness of the paste layer is only applicable if we consider that the paste is made with the grain value Dmax = 2.85, the value from which we observe a stable model. Keywords: concrete, fractal method, paste thickness, water thickness, workability
Procedia PDF Downloads 378
1864 Cooperative CDD Scheme Based on Adaptive Modulation in Wireless Communication System
Authors: Seung-Jun Yu, Hwan-Jun Choi, Hyoung-Kyu Song
Abstract:
Among spatial diversity schemes, orthogonal space-time block codes (OSTBC) and cyclic delay diversity (CDD) have been widely studied for cooperative wireless relaying systems. However, conventional OSTBC and CDD cannot cope with changes in the number of relays owing to low throughput or poor error performance. In this paper, we propose a cooperative cyclic delay diversity (CDD) scheme that uses hierarchical modulation at the source and adaptive modulation based on a cyclic redundancy check (CRC) code at the relays. Keywords: adaptive modulation, cooperative communication, CDD, OSTBC
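Cyclic delay diversity amounts to each cooperating relay forwarding a cyclically shifted copy of the same time-domain OFDM symbol, which the destination sees as a single transmission over a more frequency-selective channel. The sketch below illustrates this with an assumed FFT size and assumed per-relay delays; it does not model the hierarchical or CRC-based adaptive modulation of the proposed scheme.

```python
import numpy as np

N_FFT = 64                       # assumed OFDM symbol length
relay_delays = [0, 4, 8]         # assumed cyclic delay (in samples) for each relay

rng = np.random.default_rng(0)
qam = (rng.choice([-1.0, 1.0], N_FFT) + 1j * rng.choice([-1.0, 1.0], N_FFT)) / np.sqrt(2)
time_symbol = np.fft.ifft(qam)   # common time-domain OFDM symbol from the source

# Each cooperating relay forwards a cyclically shifted copy of the same symbol.
received = sum(np.roll(time_symbol, d) for d in relay_delays)

# The superposition is equivalent to one transmission over a channel whose
# frequency response is the FFT of the combined delay profile.
delay_profile = np.zeros(N_FFT)
for d in relay_delays:
    delay_profile[d] += 1.0
equivalent_channel = np.fft.fft(delay_profile)
assert np.allclose(np.fft.fft(received), qam * equivalent_channel)
print(np.round(np.abs(equivalent_channel[:8]), 2))
```

Because the diversity is absorbed into the equivalent channel, the number of relays can change without altering the receiver structure, which is the property the proposed scheme builds on.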
Procedia PDF Downloads 431