Search results for: code blue response time

14763 CFD Simulation of Spacer Effect on Turbulent Mixing Phenomena in Sub Channels of Boiling Nuclear Assemblies

Authors: Shashi Kant Verma, S. L. Sinha, D. K. Chandraker

Abstract:

Numerical simulations of selected subchannel tracer (potassium nitrate) experiments have been performed to study the capabilities of state-of-the-art Computational Fluid Dynamics (CFD) codes. The CFD methodology can be useful for investigating the spacer effect on turbulent mixing and for predicting turbulent flow behavior such as dimensionless mixing scalar distributions, radial velocity and vortices in the nuclear fuel assembly. A Gibson and Launder (GL) Reynolds stress model (RSM) has been selected as the primary turbulence model, as it has previously been found reasonably accurate for predicting flows inside rod bundles. As a comparison, the case is also simulated using a standard k-ε turbulence model that is widely used in industry. Despite being an isotropic turbulence model, it has also been used for modeling flow in rod bundles and reproduces lateral velocities after thorough mixing of the coolant fairly well. Both models have been solved numerically to obtain the fully developed isothermal turbulent flow in a 30º segment of a 54-rod bundle. Numerical simulation has been carried out to study the natural mixing of a tracer (passive scalar), characterizing the growth of turbulent diffusion in the injected sub-channel and, afterwards, the cross-mixing between adjacent sub-channels. The mixing with water has been studied numerically by means of steady-state CFD simulations with the commercial code STAR-CCM+. Flow enters the computational domain through mass inflows at the three subchannel faces. A turbulence intensity of 1% and a hydraulic diameter of 5.9 mm were used at the inlet. The passive scalar (potassium nitrate) is injected with a mass fraction of 5.536 ppm at subchannel 2 (upstream of the mixing section). Flow exits the domain through a pressure outlet boundary (0 Pa), with a reference pressure of 1 atm. Simulation results have been extracted at different locations of the mixing zone and the downstream zone. The local mass fraction shows uniform mixing. The effect of the applied turbulence model is nearly negligible just before the outlet plane, because the distributions look almost identical and the flow is fully developed. On the other hand, the dimensionless mixing scalar distributions change noticeably in quantitative terms, which is visible in the different scales of the colour bars.
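As an illustration of the post-processing step implied above, the short sketch below normalizes extracted tracer mass fractions into a dimensionless mixing scalar; only the injected mass fraction (5.536 ppm) comes from the abstract, and the sampled values and background level are hypothetical placeholders.

```python
import numpy as np

# Hypothetical post-processing sketch: normalize tracer mass fractions extracted
# from the CFD solution into a dimensionless mixing scalar. The injected mass
# fraction (5.536 ppm) is taken from the abstract; the sampled values below are
# illustrative placeholders, not simulation output.
w_injected = 5.536e-6            # tracer mass fraction at the injection sub-channel
w_background = 0.0               # tracer mass fraction far upstream
w_samples = np.array([5.1e-6, 3.2e-6, 1.9e-6, 1.85e-6])  # sampled along the mixing zone

# Dimensionless mixing scalar: 1 at the injection concentration, 0 at background.
theta = (w_samples - w_background) / (w_injected - w_background)
print(theta)
```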

Keywords: single-phase flow, turbulent mixing, tracer, sub channel analysis

Procedia PDF Downloads 196
14762 A Simulation-Optimization Approach to Control Production, Subcontracting and Maintenance Decisions for a Deteriorating Production System

Authors: Héctor Rivera-Gómez, Eva Selene Hernández-Gress, Oscar Montaño-Arango, Jose Ramon Corona-Armenta

Abstract:

This research studies the joint production, maintenance and subcontracting control policy for an unreliable, deteriorating manufacturing system. Production activities are controlled by a derivation of the Hedging Point Policy; given that the system is subject to deterioration, it progressively loses its capacity to satisfy product demand. Multiple deterioration effects are considered, reflected mainly in the quality of the parts produced and the reliability of the machine. Subcontracting is available as support to satisfy product demand, and overhaul maintenance can be conducted to reduce the effects of deterioration. The main objective of the research is to determine simultaneously the production, maintenance and subcontracting rates which minimize the total incurred cost. A stochastic dynamic programming model is developed and solved through a simulation-based approach composed of statistical analysis and optimization with the response surface methodology. The obtained results highlight the strong interactions between production, deterioration and quality, which justify the development of an integrated model. A numerical example and a sensitivity analysis are presented to validate the results.
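For readers unfamiliar with the control rule mentioned above, the sketch below shows the textbook form of a hedging point policy; the capacity, demand rate, and hedging level are illustrative assumptions, not the authors' optimized values.

```python
def production_rate(inventory, hedging_point, u_max, demand_rate):
    """Classical hedging point policy (illustrative sketch, not the authors' exact
    derivation): produce at maximum rate below the hedging point, match demand at
    the hedging point, and stop production above it."""
    if inventory < hedging_point:
        return u_max
    if inventory == hedging_point:
        return demand_rate
    return 0.0

# Example (assumed values): capacity 12 parts/h, demand 8 parts/h, hedging level 20.
for x in (5, 20, 30):
    print(x, production_rate(x, hedging_point=20, u_max=12.0, demand_rate=8.0))
```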

Keywords: subcontracting, optimal control, deterioration, simulation, production planning

Procedia PDF Downloads 569
14761 Jewish Law in the State of Israel: Law, Religion and State

Authors: Yuval Sinai

Abstract:

As part of the historical, religious and cultural heritage of the Jewish people, Jewish law is part of the legal system in Israel, which is a Jewish and democratic state. The proper degree of use of Jewish law in judicial decisions is an issue that crops up in Israeli law from time to time. This was a burning question in the 1980s in the wake of the enactment of the Foundations of Law Act 1980, which declared Jewish heritage a supplementary legal method to Israeli law. The enactment of the Basic Law: Human Dignity and Liberty 1992, which decreed that the basic Israeli legal principles must be interpreted in light of the values of a Jewish and democratic state, marks a significant change in the impact of Judaism in the law created and applied by the courts. Both of these legislative developments revived the initiative to grant a central status to Jewish law within the state law. How should Jewish law be applied in Israel’s secular courts? This is not a simple question. It is not merely a question of identifying the relevant rule of Jewish law or tracing its development from the Talmud to modern times. Nor is it the same as asking how a rabbinic court would handle the issue. It is a matter of delicate judgment to distill out of the often conflicting Jewish law sources a rule that will fit into the existing framework of Israeli law so as to advance a policy that will best promote the interests of Israel’s society. We shall point out the occasional tensions between Jewish religious law and secular law, and introduce opinions as to how reconciliation of the two can best be achieved in light of Jewish legal tradition and in light of the reality in the modern State of Israel.

Keywords: law and religion, israel, jewish law, law and society

Procedia PDF Downloads 54
14760 Photocatalytic Degradation of Phenolic Compounds in Wastewater Using Magnetically Recoverable Catalyst

Authors: Ahmed K. Sharaby, Ahmed S. El-Gendy

Abstract:

Phenolic compounds (PCs) exist in the wastewater effluents of industries such as oil refining, pharmaceuticals and cosmetics. Phenolic compounds are extremely hazardous pollutants that can cause severe problems to aquatic life and human beings if disposed of without treatment. One of the most efficient treatment methods for PCs is photocatalytic degradation. The current work studies the performance of a composite nanomaterial of titanium dioxide with magnetite as a photocatalyst in the degradation of PCs. It aims at optimizing the synthesized photocatalyst dosage and contact time as operational parameters at different initial concentrations of PCs and pH values in the wastewater. The study was performed in a lab-scale batch reactor under fixed conditions of light intensity and aeration rate. The initial concentrations of PCs and the pH values were in the ranges of 10-200 mg/l and 3-9, respectively. Results of the study indicate that the catalyst dosage and contact time required for total mineralization are proportional to the initial concentration of PCs, while the optimum pH for highly efficient degradation is 3. Exceeding certain catalyst concentration levels leads to a decrease in the degradation efficiency due to the dissipation of light. The performance of the catalyst for degradation was also investigated in comparison to pure TiO2 Degussa (P-25). The dosage of the synthesized catalyst required for photocatalytic degradation was approximately 1.5 times that needed with the pure titania.

Keywords: industrial, optimization, phenolic compounds, photocatalysis, wastewater

Procedia PDF Downloads 306
14759 Exploitation of Variability for Salinity Tolerance in Maize Hybrids (Zea Mays L.) at Early Growth Stage

Authors: Abdul Qayyum, Hafiz Muhammad Saeed, Mamoona Hanif, Etrat Noor, Waqas Malik, Shoaib Liaqat

Abstract:

Salinity is an extremely serious problem that has a drastic effect on the maize crop and the environment and causes economic losses to the country. An advanced approach to overcoming salinity is to develop salt-tolerant genotypes, which requires screening of a huge germplasm to start a breeding program. Therefore, the present study was undertaken to screen 25 maize hybrids of different origins for salinity tolerance at the seedling stage under three salt stress levels: a control and 250 and 300 mM NaCl. The existence of variation in tolerance to enhanced NaCl salinity levels at the seedling stage in maize proved that the hybrids had differing abilities to grow under a saline environment and that there is potential variability within the species. Almost all twenty-five maize hybrids responded differently to the different salinity levels. The maize hybrids H6, H13, H21, H23 and H24 expressed better performance under salt stress in terms of all six characters and proved to be highly tolerant, while H22, H17, H20, H18, H4, H9 and H8 were identified as moderately tolerant. Hybrids H14, H5, H11, H3, H12 and H2 were the most sensitive to salinity. These results suggest that screening is an effective tool to exploit genetic variation among maize hybrids and that salt tolerance in maize can be enhanced through selection and breeding procedures.

Keywords: salinity, hybrids, maize, variation

Procedia PDF Downloads 697
14758 Totally Robotic Gastric Bypass Using Modified Lonroth Technique

Authors: Arun Prasad

Abstract:

Background: Robotic bariatric surgery is a good option for the super obese, where laparoscopy demands challenging technical skills. Gastric bypass can be difficult due to the inability of the robot to work in two quadrants at the same time. The Lonroth technique of gastric bypass involves a totally supracolic surgery where all anastomoses are performed in one quadrant only. Methods: We have performed 78 robotic gastric bypass surgeries using the modified Lonroth technique. The robot is docked above the head of the patient in the midline. The camera port is placed supraumbilically. Two ports are placed on the left side of the patient and one port on the right side. An assistant port is placed between the camera port and the right-sided robotic port for use of the stapler. The gastric pouch is made first, followed by the gastrojejunostomy, which is a four-layered sutured anastomosis. The jejunojejunostomy is then performed, followed by a leak test, and then the jejunum is divided. A 150 cm biliopancreatic limb and a 75 cm alimentary limb are finally obtained. The mesenteric and Petersen's defects are then closed. Results: All patients had a successful robotic procedure. The mean time taken in the first 5 cases was 130 minutes; this was reduced to a mean of 95 minutes in the last five cases. There were no intraoperative or postoperative complications. Conclusions: While a hybrid technique of partly laparoscopic and partly robotic gastric bypass has been performed at many centres, we feel that with the modified Lonroth technique, a totally robotic gastric bypass surgery fully utilizes the potential of robotic bariatric surgery.

Keywords: robot, bariatric, totally robotic, gastric bypass

Procedia PDF Downloads 241
14757 Wideband Performance Analysis of C-FDTD Based Algorithms in the Discretization Impoverishment of a Curved Surface

Authors: Lucas L. L. Fortes, Sandro T. M. Gonçalves

Abstract:

In this work, the wideband performance under mesh discretization impoverishment is analyzed for the Conformal Finite Difference Time-Domain (C-FDTD) approaches developed by Raj Mittra, Supriyo Dey and Wenhua Yu for the Finite Difference Time-Domain (FDTD) method. These approaches are a simple and efficient way to optimize the scattering simulation of curved surfaces for dielectric and Perfect Electric Conducting (PEC) structures in the FDTD method, since curved surfaces otherwise require dense meshes to reduce the error introduced by surface staircasing. Referred to in this work as D-FDTD-Diel and D-FDTD-PEC, these approaches are well known in the literature, but the improvement gained from their application has not been broadly quantified for wide frequency bands and poorly discretized meshes. Both approaches improve the accuracy of the simulation without requiring dense meshes, making it possible to exploit poorly discretized meshes that reduce simulation time and computational expense while retaining the desired accuracy. However, their application presents limitations regarding the degree of mesh impoverishment and the frequency range desired. Therefore, the goal of this work is to explore the approaches regarding both wideband and mesh-impoverishment performance, to give a wider insight into these aspects in FDTD applications. The D-FDTD-Diel approach consists of modifying the electric field update in the cells intersected by the dielectric surface, taking into account the amount of dielectric material within the edges of the mesh cells. By taking the intersections into account, D-FDTD-Diel provides an accuracy improvement at the cost of computational preprocessing, which is a fair trade-off, since the update modification is quite simple. Likewise, the D-FDTD-PEC approach consists of modifying the magnetic field update, taking into account the PEC curved surface intersections within the mesh cells and, considering a PEC structure in vacuum, the air portion that fills the intersected cells when updating the magnetic field values. Like D-FDTD-Diel, D-FDTD-PEC provides better accuracy at the cost of computational preprocessing, although with the drawback of having to meet stability criterion requirements. The algorithms are formulated and applied to PEC and dielectric spherical scattering surfaces with meshes presenting different levels of discretization, with polytetrafluoroethylene (PTFE) as the dielectric, a very common material in coaxial cables and connectors for radiofrequency (RF) and wideband applications. The accuracy of the algorithms is quantified, showing the drop in wideband performance of the approaches as the mesh is impoverished. The benefits in computational efficiency, simulation time and accuracy are also shown and discussed according to the frequency range desired, showing that poorly discretized mesh FDTD simulations can be exploited more efficiently while retaining the desired accuracy. The results obtained provide a broader insight into the limitations of applying the C-FDTD approaches in poorly discretized and wide-frequency-band simulations for dielectric and PEC curved surfaces, limitations which are not clearly defined or detailed in the literature and are, therefore, a novelty. These approaches are also expected to be applied in the modeling of curved RF components for wideband and high-speed communication devices in future works.
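To make the conformal idea concrete, here is a minimal 1D sketch in which the cell cut by a dielectric interface receives a volume-weighted effective permittivity instead of a staircased all-or-nothing value; this is a simplified illustration of the principle, not the exact Dey-Mittra-Yu update used in the paper, and the grid, interface position and source are assumptions.

```python
import numpy as np

# Minimal 1D FDTD sketch of the D-FDTD-Diel idea: the cell crossed by the dielectric
# interface gets an effective permittivity weighted by the dielectric fill fraction.
c0, eps0, mu0 = 3e8, 8.854e-12, 4e-7 * np.pi
nx, dx = 400, 1e-3
dt = 0.5 * dx / c0                       # Courant-stable time step

eps_r = np.ones(nx)
interface = 250.35                       # PTFE region starts partway through cell 250 (assumed)
eps_r[251:] = 2.1                        # PTFE relative permittivity
frac = 251 - interface                   # dielectric fraction of the cut cell
eps_r[250] = frac * 2.1 + (1 - frac) * 1.0   # volume-weighted effective permittivity

Ez = np.zeros(nx)
Hy = np.zeros(nx)
for n in range(1000):
    Hy[:-1] += dt / (mu0 * dx) * (Ez[1:] - Ez[:-1])
    Ez[1:] += dt / (eps0 * eps_r[1:] * dx) * (Hy[1:] - Hy[:-1])
    Ez[50] += np.exp(-((n - 60) / 20.0) ** 2)   # soft Gaussian source
print(Ez[200], Ez[300])
```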

Keywords: accuracy, computational efficiency, finite difference time-domain, mesh impoverishment

Procedia PDF Downloads 114
14756 Numerical Modelling of Dust Propagation in the Atmosphere of Tbilisi City in Case of Western Background Light Air

Authors: N. Gigauri, V. Kukhalashvili, A. Surmava, L. Intskirveli, L. Gverdtsiteli

Abstract:

Tbilisi, a large city of the South Caucasus, is a junction point connecting Asia and Europe, Russia and the republics of Asia Minor. Over the last years, its atmosphere has experienced an increasing anthropogenic load. A numerical modeling method is used to study Tbilisi's atmospheric air pollution. By means of a 3D non-linear, non-steady numerical model, the peculiarities of the city's atmospheric pollution are investigated in the case of a background western light air. The spatial and temporal changes of dust concentration are determined. The zones of high, average and low pollution, dust accumulation areas, transfer directions, etc. are identified. The numerical modeling shows that the process of air pollution by dust proceeds in four stages, which depend on the intensity of motor traffic, the micro-relief of the city, and the location of the city mains. In the interval 06:00-09:00 an intensive growth, from 09:00 to 15:00 a constancy or weak decrease, from 18:00 to 21:00 an increase, and from 21:00 to 06:00 a reduction of the dust concentrations take place. The highly polluted areas are located in the vicinity of the city center and at some peripheral territories of the city, where the maximum dust concentration at 9 PM is equal to twice the maximum allowable concentration. Similar investigations conducted for various meteorological situations will enable us to compile a map of background urban pollution and to elaborate practical measures for ambient air protection.

Keywords: air pollution, dust, numerical modeling, urban

Procedia PDF Downloads 170
14755 Insect Inducible Methanol Production in Plants for Insect Resistance

Authors: Gourav Jain, Sameer Dixit, Surjeet Kumar Arya, Praveen C. Verma

Abstract:

The plant cell wall plays a major role in the defence mechanism against biotic and abiotic stress, as it constitutes the physical barrier between the microenvironment and the internal components of the cell. It is a complex structure composed mostly of carbohydrates, among which cellulose and hemicelluloses are the most abundant, embedded in a matrix of pectins and proteins. Multiple enzymes have been reported to play a vital role in cell wall modification; pectin methylesterase (PME) is one of them, catalysing the demethylesterification of the homogalacturonan component of pectin, which releases acidic pectin and methanol. As the emitted methanol is toxic to insect pests, we use the PME gene for improved methanol production. In the current study, we show that overexpression of a PME gene isolated from Withania somnifera under an insect-inducible promoter enhances methanol production at the time insects feed on the plants, and that this provides better insect resistance. We found that the transgenic tobacco causes 85-90% mortality in both chewing (Spodoptera litura larvae and Helicoverpa armigera) and sap-sucking (aphid, mealybug, and whitefly) pests. The methanol content and emission level were also enhanced by 10-15 fold at different induction time points (15 min, 30 min, 45 min, 60 min), as analysed by the Purpald/alcohol oxidase method.

Keywords: methanol, Pectin methylesterase, inducible promoters, Purpald/Alcohol oxidase

Procedia PDF Downloads 230
14754 Statistical Correlation between Logging-While-Drilling Measurements and Wireline Caliper Logs

Authors: Rima T. Alfaraj, Murtadha J. Al Tammar, Khaqan Khan, Khalid M. Alruwaili

Abstract:

OBJECTIVE/SCOPE: Caliper logging data provides critical information about wellbore shape and deformations, such as stress-induced borehole breakouts or washouts. Multiarm mechanical caliper logs are often run using wireline, which can be time-consuming, costly, and/or challenging to run in certain formations. To minimize rig time and improve operational safety, it is valuable to develop analytical solutions that can estimate caliper logs using available Logging-While-Drilling (LWD) data without the need to run wireline caliper logs. As a first step, the objective of this paper is to perform statistical analysis using an extensive dataset to identify important physical parameters that should be considered in developing such analytical solutions. METHODS, PROCEDURES, PROCESS: Caliper logs and LWD data of eleven wells, with a total of more than 80,000 data points, were obtained and imported into a data analytics software for analysis. Several parameters were selected to test their relationship with the measured maximum and minimum caliper logs. These parameters include gamma ray, porosity, shear and compressional sonic velocities, bulk densities, and azimuthal density. The data of the eleven wells were first visualized and cleaned. Using the analytics software, several analyses were then performed, including the computation of Pearson's correlation coefficients to show the statistical relationship between the selected parameters and the caliper logs. RESULTS, OBSERVATIONS, CONCLUSIONS: The results of this statistical analysis showed that some parameters correlate well with the caliper log data. For instance, the bulk density and azimuthal directional densities showed Pearson's correlation coefficients in the range of 0.39 to 0.57, which were relatively high when compared to the correlation coefficients of the caliper data with other parameters. Other parameters, such as porosity, exhibited extremely low correlation coefficients to the caliper data. Various crossplots and visualizations of the data were also demonstrated to gain further insights from the field data. NOVEL/ADDITIVE INFORMATION: This study offers a unique and novel look into the relative importance of and correlation between different LWD measurements and wireline caliper logs via an extensive dataset. The results pave the way for a more informed development of new analytical solutions for estimating the size and shape of the wellbore in real time while drilling using LWD data.
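A minimal sketch of the correlation step described above, assuming the logs have already been loaded into a table; the column names and values below are hypothetical placeholders, not the field dataset.

```python
import pandas as pd

# Illustrative sketch: Pearson's correlation coefficients between LWD curves and
# the measured caliper log. Column names and values are hypothetical placeholders.
df = pd.DataFrame({
    "gamma_ray":    [45.0, 60.2, 78.5, 52.3, 66.1],
    "bulk_density": [2.35, 2.41, 2.55, 2.38, 2.47],
    "porosity":     [0.18, 0.15, 0.09, 0.17, 0.12],
    "caliper_max":  [8.9, 8.7, 8.4, 8.8, 8.6],
})

# Correlate every LWD curve against the maximum caliper reading.
corr = df.corr(method="pearson")["caliper_max"].drop("caliper_max")
print(corr)
```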

Keywords: LWD measurements, caliper log, correlations, analysis

Procedia PDF Downloads 105
14753 Conflation Methodology Applied to Flood Recovery

Authors: Eva L. Suarez, Daniel E. Meeroff, Yan Yong

Abstract:

Current flooding risk modeling focuses on resilience, defined as the probability of recovery from a severe flooding event. However, the long-term damage to property and well-being by nuisance flooding and its long-term effects on communities are not typically included in risk assessments. An approach was developed to address the probability of recovering from a severe flooding event combined with the probability of community performance during a nuisance event. A consolidated model, namely the conflation flooding recovery (CFR) model, evaluates risk-coping mitigation strategies for communities based on the recovery time from catastrophic events, such as hurricanes or extreme surges, and from everyday nuisance flooding events. The CFR model assesses the variation contribution of each independent input and generates a weighted output that favors the distribution with minimum variation. This approach is especially useful if the input distributions have dissimilar variances. The CFR is defined as a single distribution resulting from the product of the individual probability density functions. The resulting conflated distribution resides between the parent distributions, and it infers the recovery time required by a community to return to basic functions, such as power, utilities, transportation, and civil order, after a flooding event. The CFR model is more accurate than averaging individual observations before calculating the mean and variance or averaging the probabilities evaluated at the input values, which assigns the same weighted variation to each input distribution. The main disadvantage of these traditional methods is that the resulting measure of central tendency is exactly equal to the average of the input distribution's means without the additional information provided by each individual distribution variance. When dealing with exponential distributions, such as resilience from severe flooding events and from nuisance flooding events, conflation results are equivalent to the weighted least squares method or best linear unbiased estimation. The combination of severe flooding risk with nuisance flooding improves flood risk management for highly populated coastal communities, such as in South Florida, USA, and provides a method to estimate community flood recovery time more accurately from two different sources, severe flooding events and nuisance flooding events.
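A minimal numerical sketch of the conflation operation described above (the normalized product of the input probability density functions), assuming exponential recovery-time distributions with illustrative means; the specific mean recovery times are assumptions, not values from the paper.

```python
import numpy as np

# Conflation sketch: the conflated density is the normalized product of the input
# probability density functions. Severe and nuisance recovery times are modelled
# here as exponentials with assumed means of 20 days and 3 days.
x = np.linspace(0, 60, 2000)                 # recovery time in days
f_severe = (1 / 20.0) * np.exp(-x / 20.0)    # mean 20 days (assumed)
f_nuisance = (1 / 3.0) * np.exp(-x / 3.0)    # mean 3 days (assumed)

product = f_severe * f_nuisance
conflated = product / np.trapz(product, x)   # normalize so the density integrates to 1

mean_conflated = np.trapz(x * conflated, x)
print(f"conflated mean recovery time: {mean_conflated:.2f} days")  # ~ 1/(1/20 + 1/3)
```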

Keywords: community resilience, conflation, flood risk, nuisance flooding

Procedia PDF Downloads 88
14752 Importance of an E-Learning Program in Stress Field for Postgraduate Courses of Doctors

Authors: Ramona-Niculina Jurcau, Ioana-Marieta Jurcau

Abstract:

Background: Preparation in the stress field (SF) is, increasingly, a concern for doctors of different specialties. Aims: The aim was to evaluate the importance of an e-learning program for doctors' postgraduate courses in SF. Methods: Doctors (n = 40 male, 40 female) of different specialties and ages (31-71 years), who attended postgraduate courses in SF, voluntarily responded to a questionnaire that included the following themes: importance of SF courses for the specialty practiced by each respondent doctor (using a visual analogue scale, VAS); which SF themes would be indicated for e-learning (EL); preferred form of SF information assimilation: classical lectures (CL), EL or a combination of these methods (CL+EL); which information in the SF course is facilitated by the EL model versus CL; and, in their view, the first four advantages and the first four disadvantages of EL compared to CL for SF. Results: For most respondents, the SF courses are important for the specialty they practice (VAS average of 4). The SF themes suggested to be delivered as EL were: stress mechanisms; stress factor models for different medical specialties; stress assessment methods; primary stress management methods for different specialties. The preferred form of information assimilation was CL+EL. Aspects of the course facilitated by the EL model versus CL: active reading of theoretical information, with fast access to keyword details; watching documentaries in everyone's favorite order; practice through tests and rapid checking of results. The first four EL advantages mentioned for SF were: autonomy in managing the time allocated to the study; saving time for traveling to the venue; the ability to read information in various contexts of time and space; communication with colleagues at times convenient for everyone. The first three EL disadvantages mentioned for SF were: it decreases the capability for group discussion and mobilization for active participation; EL information access may depend on an electrical source and/or the Internet; a learning slowdown can appear through the temptation to postpone implementation. Answers to the questions were partially influenced by the respondent's age and gender. Conclusions: 1) Postgraduate courses in SF are of interest to doctors of different specialties. 2) The majority of participating doctors preferred EL, but combined with CL (CL+EL). 3) The preference for EL was manifested mainly by young or middle-aged male doctors. 4) It is important to balance the proper formula for the chosen EL, so that it is as efficient, interesting, useful and agreeable as possible.

Keywords: stress field, doctors’ postgraduate courses, classical lectures, e-learning lecture

Procedia PDF Downloads 223
14751 Seismic Performance of Slopes Subjected to Earthquake Mainshock Aftershock Sequences

Authors: Alisha Khanal, Gokhan Saygili

Abstract:

It is commonly observed that aftershocks follow the mainshock. Aftershocks continue over a period of time with decreasing frequency, and typically there is not sufficient time for repair and retrofit between a mainshock-aftershock sequence. Usually, aftershocks are smaller in magnitude; however, aftershock ground motion characteristics such as intensity and duration can be greater than those of the mainshock due to changes in the earthquake mechanism and location with respect to the site. The seismic performance of slopes is typically evaluated based on the sliding displacement predicted to occur along a critical sliding surface. Various empirical models are available that predict sliding displacement as a function of seismic loading parameters, ground motion parameters, and site parameters, but these models do not include aftershocks. The seismic risks associated with post-mainshock slopes ('damaged slopes') subjected to aftershocks are significant. This paper extends the empirical sliding displacement models to flexible slopes subjected to earthquake mainshock-aftershock sequences (a multi-hazard approach). A dataset was developed using 144 pairs of as-recorded mainshock-aftershock sequences from the Pacific Earthquake Engineering Research Center (PEER) database. The results reveal that the combination of mainshock and aftershock increases the seismic demand on slopes relative to the mainshock alone; thus, seismic risks are underestimated if aftershocks are neglected.
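As a simplified illustration of why a mainshock-aftershock sequence increases sliding displacement, the sketch below applies a rigid-block Newmark integration to synthetic records; the paper's empirical models for flexible slopes are more elaborate, and the yield acceleration and ground motions here are assumptions.

```python
import numpy as np

# Illustrative rigid-block Newmark sketch: sliding velocity accumulates whenever
# ground acceleration exceeds the yield acceleration; the mainshock and aftershock
# records are simply concatenated into one sequence. Records are synthetic.
def newmark_displacement(acc, dt, a_yield):
    vel, disp = 0.0, 0.0
    for a in acc:
        if a > a_yield or vel > 0.0:          # block slides or is still moving
            vel += (a - a_yield) * dt
            vel = max(vel, 0.0)               # sliding only in the downslope sense
            disp += vel * dt
    return disp

dt = 0.01
t = np.arange(0, 20, dt)
mainshock = 0.30 * 9.81 * np.sin(2 * np.pi * 1.0 * t) * np.exp(-0.15 * t)
aftershock = 0.18 * 9.81 * np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.20 * t)

a_y = 0.10 * 9.81                              # assumed yield acceleration
d_main = newmark_displacement(mainshock, dt, a_y)
d_seq = newmark_displacement(np.concatenate([mainshock, aftershock]), dt, a_y)
print(f"mainshock only: {d_main:.3f} m, mainshock + aftershock: {d_seq:.3f} m")
```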

Keywords: seismic slope stability, mainshock, aftershock, landslide, earthquake, flexible slopes

Procedia PDF Downloads 133
14750 Resource Sharing Issues of Distributed Systems Influences on Healthcare Sector Concurrent Environment

Authors: Soo Hong Da, Ng Zheng Yao, Burra Venkata Durga Kumar

Abstract:

The healthcare sector is a business that consists of providing medical services, manufacturing medical equipment and drugs, and providing medical insurance to the public. Most of the time, the data stored in a healthcare database relates to patients' information, which is required to be accurate when it is accessed by authorized stakeholders. In distributed systems, one important issue is concurrency, as it ensures that shared resources are synchronized and remain consistent through multiple read and write operations by multiple clients. The problems of concurrency in the healthcare sector are who gets access, and how the shared data is synchronized and remains consistent when two or more stakeholders attempt to access the shared data simultaneously. In this paper, a framework that benefits the distributed healthcare sector's concurrent environment is proposed. In the proposed framework, four different levels of database nodes, namely the national center, regional center, referral center, and local center, are explained. Moreover, synchronization in the framework is not symmetrical: two synchronization techniques, complete and partial synchronization operations, are explained. Furthermore, for the case where multiple clients access the data at the same time, the synchronization types are discussed with cases at different levels and priorities to ensure data remains synchronized throughout the processes.
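A minimal sketch of the concurrency concern described above, assuming a single shared patient record protected by a lock so that simultaneous writers cannot leave it inconsistent; the record fields and center names are hypothetical, not part of the proposed framework.

```python
import threading

# Sketch: multiple stakeholders reading and updating a shared patient record.
# A lock serializes writes so the record stays consistent under concurrent access.
class PatientRecord:
    def __init__(self):
        self._lock = threading.Lock()
        self._data = {"blood_type": "O+", "allergies": []}

    def read(self, key):
        with self._lock:                 # consistent snapshot for readers
            return self._data.get(key)

    def update(self, key, value):
        with self._lock:                 # only one writer at a time
            self._data[key] = value

record = PatientRecord()
writers = [threading.Thread(target=record.update, args=("ward", f"regional-{i}"))
           for i in range(4)]
for w in writers:
    w.start()
for w in writers:
    w.join()
print(record.read("ward"))
```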

Keywords: resources, healthcare, concurrency, synchronization, stakeholders, database

Procedia PDF Downloads 136
14749 Taguchi-Based Optimization of Surface Roughness and Dimensional Accuracy in Wire EDM Process with S7 Heat Treated Steel

Authors: Joseph C. Chen, Joshua Cox

Abstract:

This research focuses on the use of the Taguchi method to reduce the surface roughness and improve the dimensional accuracy of parts machined by Wire Electrical Discharge Machining (EDM) with S7 heat-treated steel. Due to its high impact toughness, the material is a candidate for a wide variety of tooling applications which require high precision in dimension and a desired surface roughness. This paper demonstrates that the Taguchi Parameter Design methodology is able to optimize both dimensional accuracy and surface roughness by investigating seven wire-EDM controllable parameters: pulse on time (ON), pulse off time (OFF), servo voltage (SV), voltage (V), servo feed (SF), wire tension (WT), and wire speed (WS). The temperature of the water in the wire EDM process is investigated as the noise factor in this research. Experimental design and analysis based on L18 Taguchi orthogonal arrays are conducted. This paper demonstrates that the Taguchi-based system enables the wire EDM process to produce (1) high-precision parts with an average dimension of 0.6601 inches against a desired dimension of 0.6600 inches; and (2) a surface roughness of 1.7322 microns, which is significantly improved from 2.8160 microns.
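A small sketch of the smaller-the-better signal-to-noise ratio that Taguchi Parameter Design uses for responses such as surface roughness; the replicate values are illustrative, not the experimental data.

```python
import numpy as np

# Smaller-the-better S/N ratio used in Taguchi Parameter Design, applied to surface
# roughness replicates measured under the two noise conditions (water temperatures).
def sn_smaller_is_better(y):
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# One L18 run with two replicates (illustrative values in microns).
roughness_replicates = [1.74, 1.72]
print(f"S/N = {sn_smaller_is_better(roughness_replicates):.2f} dB")
```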

Keywords: Taguchi Parameter Design, surface roughness, Wire EDM, dimensional accuracy

Procedia PDF Downloads 360
14748 The Effect of Visfatin on Pregnant Mouse Myometrial Contractility in vitro

Authors: Seham Alsaif, Susan Wray

Abstract:

Obesity is a worldwide disorder influencing women's health and childbearing. There is a close relation between obesity and pregnancy-related complications. Dyslipidemia and adipokine dysregulation are core environmental changes that may mechanistically link these complications with obesity in pregnant women. We have previously found that visfatin has a relaxant effect on mouse, rat and human myometrial contractility. We hypothesised that visfatin inhibits mouse myometrial contractility through the NAD+ pathway. This study was designed to examine the mechanism of action of visfatin on myometrial contractility. To examine the NAD+ pathway, FK866, a potent inhibitor of NAD+ biosynthesis, was used. Methods: Myometrial strips from term pregnant mice were dissected and superfused with physiological saline, and the effects of visfatin (10 nM) on oxytocin-induced contractions (0.5 nM) alone and after the infusion of FK866 (10 µM) were studied. After regular contractions were established, contractility was examined for the control (100%) and test response at 37 °C for 10 min each. Results: FK866 was found to inhibit the effect of visfatin on myometrial contractility (the AUC increased from 89±2% of control, P=0.0009, for visfatin alone to 97±4% of control, P>0.05, for visfatin combined with FK866, n=8). In conclusion, the NAD+ pathway appears to be involved in the mechanism of action of visfatin on mouse myometrium. This could have a role in identifying new targets to prevent obesity-related complications.

Keywords: myometrium, obesity, oxytocin, pregnancy, visfatin

Procedia PDF Downloads 164
14747 Analyzing How Working From Home Can Lead to Higher Job Satisfaction for Employees Who Have Care Responsibilities Using Structural Equation Modeling

Authors: Christian Louis Kühner, Florian Pfeffel, Valentin Nickolai

Abstract:

Taking care of children, dependents, or pets can be a difficult and time-consuming task. Especially for part- and full-time employees, it can feel exhausting and overwhelming to meet these obligations besides working a job. Thus, working mostly at home and not having to commute to the company can save valuable time and reduce stress. This study aims to show the influence that the working model has on the job satisfaction of employees with care responsibilities in comparison to employees who do not have such obligations. Using structural equation modeling (SEM), the three work models, "working from home", "working remotely", and a hybrid model, have been analyzed based on 13 constructs influencing job satisfaction. These 13 factors have been further summarized into three groups: "classic influencing factors", "influencing factors changed by remote working", and "new remote working influencing factors". Based on the influencing factors on job satisfaction, an online survey was conducted with n = 684 employees from the service sector. Cronbach's alpha of the individual constructs was shown to be suitable. Furthermore, the construct validity of the constructs was confirmed by face validity, content validity, convergent validity (AVE > 0.5; CR > 0.7), and discriminant validity. In addition, confirmatory factor analysis (CFA) confirmed the model fit for the investigated sample (CMIN/DF: 2.567; CFI: 0.927; RMSEA: 0.048). The SEM analysis has shown that the most significant influencing factor on job satisfaction is "identification with the work" (β = 0.540), followed by "appreciation" (β = 0.151), "compensation" (β = 0.124), "work-life balance" (β = 0.116), and "communication and exchange of information" (β = 0.105). While the significance of each factor can vary depending on the work model, the SEM analysis shows that identification with the work is the most significant factor in all three work models and, in the case of the traditional office work model, the only significant influencing factor. The study shows that, among employees with care responsibilities, the higher the proportion of working from home compared to working from the office, the more satisfied the employees are with their job. Since work models that meet the requirements of comprehensive care lead to higher job satisfaction among employees with such obligations, adapting as a company to such private obligations of employees can be crucial to sustained success. Conversely, the satisfaction level with the office-based work model is higher for workers without caregiving responsibilities.
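A short sketch of the reliability check mentioned above, namely Cronbach's alpha computed from raw item scores for one construct; the item responses below are hypothetical, not the survey data of the study.

```python
import numpy as np

# Cronbach's alpha for one construct from raw item scores (respondents x items).
# The 5-point responses below are hypothetical placeholders.
def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

scores = np.array([[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2], [4, 4, 5]])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```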

Keywords: care responsibilities, home office, job satisfaction, structural equation modeling

Procedia PDF Downloads 69
14746 Understanding How to Increase Restorativeness of Interiors: A Qualitative Exploratory Study on Attention Restoration Theory in Relation to Interior Design

Authors: Hande Burcu Deniz

Abstract:

People in the U.S. spend a considerable portion of their time indoors. This makes it crucial to provide environments that support people's well-being. Restorative environments help people recover the cognitive resources spent through intensive use of directed attention. Spending time in nature and taking a nap are two of the best ways to restore these resources; however, they are not possible most of the time. The problem is that many studies have revealed how nature and spending time in natural contexts can boost restoration, but fewer studies have been conducted to understand how cognitive resources can be restored in interior settings. This study aims to explore the answer to the question: which qualities of interiors increase the restorativeness of an interior setting, and how do they mediate it? To do this, a phenomenological qualitative study was conducted. The study was interested in the definition of attention restoration and experiences of the phenomenon. As the themes emerged, they were analyzed and matched with the Attention Restoration Theory (ART) components (being away, extent, fascination, compatibility) to examine how interior design elements mediate the restorativeness of an interior. The data were gathered from semi-structured interviews with international residents of Minnesota. The interviewees represent young professionals who work in Minnesota and often experience mental fatigue. They also have fewer emotional connections with places in Minnesota, which enabled the data to be based on the physical qualities of a space rather than emotional connections. In the interviews, participants were asked where they prefer to be when they experience mental fatigue. Next, they were asked to describe the physical qualities of the places they prefer to be, and their reasons. Four themes were derived from the analysis of the interviews; they are listed in order of frequency. The first and most common theme was "connection to outside". The analysis showed that people need to be either physically or visually connected to the outside to recover from mental fatigue. A direct connection to nature was reported as preferable, whereas urban settings and interiors were secondary preferences. The second theme that emerged from the analysis was "the presence of artwork", which was experienced differently by the interviewees. The third theme was "amenities": the interviews pointed out that people prefer to have amenities that support the desired activity during recovery from mental fatigue. The last theme was "aesthetics": interviewees stated that they prefer places that are pleasing to their eyes, and that they could not shed the feeling of being worn out in places that are not well designed. When the themes were matched with the four ART components (being away, extent, fascination, compatibility), some of the interior qualities showed overlap, since they were experienced differently by the interviewees. In conclusion, this study showed that interior settings have restorative potential and that their experience is multidimensional.

Keywords: attention restoration, fatigue, interior design, qualitative study, restorative environments

Procedia PDF Downloads 240
14745 Investigation of an Alkanethiol Modified Au Electrode as Sensor for the Antioxidant Activity of Plant Compounds

Authors: Dana A. Thal, Heike Kahlert, Fritz Scholz

Abstract:

Thiol molecules are known to easily form self-assembled monolayers (SAMs) on Au surfaces. Depending on the thiol's structure, surface modification via SAMs can be used for electrode sensor development. In the presented work, 1-decanethiol-coated polycrystalline Au electrodes were applied to indirectly assess the radical scavenging potential of plant compounds and extracts. Different plant compounds with reported antioxidant properties, as well as an extract from the plant Gynostemma pentaphyllum, were tested for their effectiveness in preventing SAM degradation on the sensor electrodes by photolytically generated radicals in aqueous media. The SAM degradation was monitored over time by differential pulse voltammetry (DPV) measurements. The results were compared to established antioxidant assays. The obtained data showed an exposure-time- and concentration-dependent degradation process of the SAM at the electrode surfaces. The tested substances differed in their effectiveness in preventing SAM degradation. The calculated radical scavenging activities of the tested plant compounds differed between assays. The presented method is a simple system for radical scavenging evaluation and, considering the importance of the test system in antioxidant activity evaluation, might serve as a bridging tool between in vivo and in vitro antioxidant assays, in order to obtain more biologically relevant results in antioxidant research.

Keywords: alkanethiol SAM, plant antioxidant, polycrystalline Au, radical scavenger

Procedia PDF Downloads 287
14744 The Effect of Isokinetic Fatigue of Ankle, Knee, and Hip Muscles on the Dynamic Postural Stability Index

Authors: Masoumeh Shojaei, Natalie Gedayloo, Amir Sarshin

Abstract:

The purpose of the present study was to investigate the effect of isokinetic fatigue of the muscles around the ankle, knee, and hip on indicators of dynamic postural stability. Fifteen female university students (age 19.7 ± 0.6 years, weight 54.6 ± 9.4 kg, and height 163.9 ± 5.6 cm) participated in a within-subjects design over 5 different days. In the first session, the postural stability indices (time to stabilization after jump-landing) without fatigue were assessed by force plate, and in each subsequent session, one of the lower-limb muscle groups (the muscles around the ankles, knees, or hips) was randomly exhausted with a Biodex isokinetic dynamometer, and the indices were assessed immediately after fatigue of that muscle group. The method involved landing on a force plate from a dynamic state and transitioning balance into a static state. Results of a repeated-measures ANOVA indicated that there was no significant difference in the time to stabilization (TTS) before and after isokinetic fatigue of the muscles around the ankle, knee and hip in the medial-lateral direction (p > 0.05), but in the anterior-posterior (AP) direction the difference was statistically significant (p < 0.05). Least Significant Difference (LSD) post hoc test results also showed a significant difference in TTS before and after isokinetic fatigue of the knee and hip muscles in the AP direction. In other words, the knee and hip muscle groups were affected by isokinetic fatigue only in the AP direction (p < 0.05).
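A minimal sketch of the repeated-measures comparison described above, assuming statsmodels is available; the TTS values are synthetic placeholders, not the study's measurements.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Repeated-measures ANOVA sketch: each participant has a TTS value for every
# fatigue condition. The synthetic data below are placeholders only.
rng = np.random.default_rng(0)
conditions = ["baseline", "ankle", "knee", "hip"]
rows = []
for pid in range(1, 16):                          # 15 participants
    for j, cond in enumerate(conditions):
        rows.append({"subject": pid, "condition": cond,
                     "tts_ap": 1.8 + 0.1 * j + rng.normal(0, 0.05)})
df = pd.DataFrame(rows)

res = AnovaRM(data=df, depvar="tts_ap", subject="subject", within=["condition"]).fit()
print(res)
```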

Keywords: dynamic balance, fatigue, lower limb muscles, postural control

Procedia PDF Downloads 221
14743 Overview of Multi-Chip Alternatives for 2.5 and 3D Integrated Circuit Packagings

Authors: Ching-Feng Chen, Ching-Chih Tsai

Abstract:

With the size of the transistor gradually approaching the physical limit, the persistence of Moore's Law is challenged by the difficulty of developing high numerical aperture (high-NA) lithography equipment and by other issues such as short-channel effects. In the context of the ever-increasing technical requirements of portable devices and high-performance computing, relying on the continuation of the law to enhance chip density will no longer support the prospects of the electronics industry. Weighing a chip's power consumption, performance, area, cost and cycle time to market (PPACC) is an updated benchmark to drive the evolution of advanced wafer technology nodes (nm). The advent of two-and-a-half- and three-dimensional (2.5D and 3D) Very-Large-Scale Integration (VLSI) packaging based on Through-Silicon Via (TSV) technology has updated the traditional die assembly methods and provided a solution. This overview investigates up-to-date and cutting-edge packaging technologies for 2.5D and 3D integrated circuits (ICs) based on updated transistor structures and technology nodes. The authors conclude that multi-chip solutions for 2.5D and 3D IC packaging are feasible to prolong Moore's Law.

Keywords: moore’s law, high numerical aperture, power consumption-performance-area-cost-cycle time to market, 2.5 and 3D- very-large-scale integration, packaging, through silicon via

Procedia PDF Downloads 106
14742 The Legality of the Individual Education Plan from the Teachers’ Perspective in Saudi Arabia

Authors: Sohil I. Alqazlan

Abstract:

Introduction and Objectives: The individual educational plan (IEP) is the cornerstone of education for students with special educational needs (SEN). The Saudi government supports students' right to have an IEP, and their education is one of the primary goals of the Ministry of Education (MoE). However, practice does not yet reflect this huge government investment. For example, some SEN students do not have an IEP, and poor communication was found between IEP teams and students' families. As a result, this study investigated perspectives and understandings of the IEP from the views of SEN teachers in the Saudi context. Methods: The study utilised a qualitative approach, in which in-depth semi-structured interviews were conducted with 8 SEN teachers in schools in Riyadh (the capital city of Saudi Arabia). The interview findings were analysed using a thematic analysis approach. Results and Conclusion: The legality of the IEP, and whether it is considered a legal document in Saudi Arabia, were the main areas on which study participants were questioned. It was observed that the IEP is not considered a legal document in Saudi Arabia. As interpreted from the responses of the SEN teachers, the IEP lacks the required legality with respect to its implementation in Saudi Arabia. All teachers were in agreement that the IEP is not considered to be a legal document in the Kingdom of Saudi Arabia; as a result, they did not use it for all their students with SEN. Such findings might affect teaching quality and school outcomes, as all SEN students must be supported individually according to their needs.

Keywords: individual education plan, special education, IEP, teachers

Procedia PDF Downloads 156
14741 Cartagena Protocol and Beyond: Issues and Challenges in the Nigeria's Response to Biosafety

Authors: Dalhat Binta Dan - Ali

Abstract:

The reality of the new world economic order and the ever-increasing importance of biotechnology in the global economy have necessitated the ratification of the Cartagena Protocol on Biosafety and the recent promulgation of Nigeria's Biosafety Act in 2015. These legal regimes are anchored on the need to create an enabling environment for the flourishing of bio-trade and to ensure the safety of the environment and human health. This paper critically examines the legal framework on biosafety by taking a cursory look at its philosophical foundation, key issues and milestones. The paper argues that the extant laws, though a giant leap in the establishment of a legal framework on biosafety, raise debate and controversy over the difficulties of risk assessment for biodiversity and human health; other challenges include the lack of sound institutional capacity and the regimes' pursuit of a hybrid approach between environmental conservation and trade issues. The paper recommends that the country do more in the areas of stimulating awareness and establishing sound institutional capacity to enable the law to ensure an adequate level of protection in the field of the safe transfer, handling, and use of genetically modified organisms (GMOs) in Nigeria.

Keywords: Cartagena protocol, biosafety, issues, challenges, biotrade, genetically modified organism (GMOs), environment

Procedia PDF Downloads 312
14740 Chipless RFID Capacity Enhancement Using the E-pulse Technique

Authors: Haythem H. Abdullah, Hesham Elkady

Abstract:

With the fast increase in radio frequency identification (RFID) applications such as medical recording, library management, etc., the limitation of active tags stems from their need for external batteries as well as passive or active chips. The chipless RFID tag reduces the cost to a large extent, but at the expense of spectrum utilization. The cost reduction of chipless RFID is due to the absence of the chip itself. The identification is done by utilizing the spectrum in such a way that the frequency response of the tag consists of resonance frequencies that represent the bits. The system capacity is decided by the number of resonators within the pre-specified band. It is therefore important to find a solution that enhances spectrum utilization when using chipless RFID. Target identification is a process that results in a decision that a specific target is present or not. Several target identification schemes exist, but one of the most successful techniques in radar target identification in the oscillatory region is the extinction pulse (E-pulse) technique. The E-pulse technique identifies targets via their characteristic (natural) modes. By introducing an innovative solution for chipless RFID reader and tag designs, spectrum utilization approaches the optimum. In this paper, a novel capacity enhancement scheme based on the E-pulse technique is introduced to improve the performance of the chipless RFID system.

Keywords: chipless RFID, E-pulse, natural modes, resonators

Procedia PDF Downloads 57
14739 Stray Light Reduction Methodology by a Sinusoidal Light Modulation and Three-Parameter Sine Curve Fitting Algorithm for a Reflectance Spectrometer

Authors: Hung Chih Hsieh, Cheng Hao Chang, Yun Hsiang Chang, Yu Lin Chang

Abstract:

In spectrometer applications, stray light coming from the environment affects the measurement results considerably. Hence, environment and instrument quality control for stray light reduction is critical for spectral reflectance measurement. In this paper, a simple and practical method has been developed to correct a spectrometer's response for measurement errors arising from the environment's and the instrument's stray light. A sinusoidally modulated light intensity signal was incident on a tested sample, and the reflected light was collected by the spectrometer. Since a sinusoidal signal modulated the incident light, the reflected light was modulated at the same frequency as the incident signal. Using the three-parameter sine curve fitting algorithm, the primary reflectance signal can be extracted from the total measured signal, which contains both the primary reflectance signal and the stray light from the environment. The spectra extracted by the proposed method under extreme environmental stray light are 99.98% similar to the spectra measured without environmental stray light. This result shows that the reflectance spectra can be measured without being affected by the environment's stray light.
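A minimal sketch of a three-parameter sine fit, assuming the modulation frequency is known: the detected signal is fitted to A*sin(wt) + B*cos(wt) + C by linear least squares, so sqrt(A^2 + B^2) recovers the modulated reflectance while the constant C absorbs unmodulated stray light. The frequency, amplitudes, and noise level below are illustrative, not the paper's instrument settings.

```python
import numpy as np

# Three-parameter sine fit at a known modulation frequency (illustrative values).
f_mod = 37.0                                     # assumed modulation frequency, Hz
fs = 5000.0
t = np.arange(0, 0.5, 1 / fs)

true_amplitude, stray_offset = 0.82, 0.30
signal = true_amplitude * np.sin(2 * np.pi * f_mod * t + 0.4) + stray_offset
signal += 0.01 * np.random.default_rng(1).normal(size=t.size)   # detector noise

w = 2 * np.pi * f_mod
design = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
(a, b, c), *_ = np.linalg.lstsq(design, signal, rcond=None)
print(f"recovered amplitude = {np.hypot(a, b):.3f}, stray/offset = {c:.3f}")
```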

Keywords: spectrometer, stray light, three-parameter sine curve fitting, spectra extraction

Procedia PDF Downloads 226
14738 Identification of Microbial Community in an Anaerobic Reactor Treating Brewery Wastewater

Authors: Abimbola M. Enitan, John O. Odiyo, Feroz M. Swalaha

Abstract:

The study of microbial ecology and microbial function in anaerobic digestion processes is essential for controlling the biological processes, in order to understand the symbiotic relationships between the microorganisms involved in converting the complex organic matter in industrial wastewater to simple molecules. In this study, the diversity and quantity of the bacterial community in granular sludge taken from the different compartments of a full-scale upflow anaerobic sludge blanket (UASB) reactor treating brewery wastewater were investigated using polymerase chain reaction (PCR) and real-time quantitative PCR (qPCR). The phylogenetic analysis showed three major eubacterial phyla, belonging to Proteobacteria, Firmicutes and Chloroflexi, in the full-scale UASB reactor, with different groups populating different compartments. The qPCR assay showed a high amount of eubacteria, with an increase in concentration along the reactor's compartments. This study extends our understanding of the diversity, topological distribution and shifts in concentration of microbial communities in the different compartments of a full-scale UASB reactor treating brewery wastewater. The colonization of and trophic interactions among these microbial populations in reducing and transforming complex organic matter within UASB reactors were established.

Keywords: bacteria, brewery wastewater, real-time quantitative PCR, UASB reactor

Procedia PDF Downloads 245
14737 Speaker Identification by Atomic Decomposition of Learned Features Using Computational Auditory Scene Analysis Principals in Noisy Environments

Authors: Thomas Bryan, Veton Kepuska, Ivica Kostanic

Abstract:

Speaker recognition is performed in high Additive White Gaussian Noise (AWGN) environments using principles of Computational Auditory Scene Analysis (CASA). CASA methods often classify sounds from images in the time-frequency (T-F) plane, using spectrograms or cochleagrams as the image. In this paper, atomic decomposition implemented by matching pursuit performs the transform from time-series speech signals to the T-F plane. The atomic decomposition creates a sparsely populated T-F vector in "weight space", where each populated T-F position contains an amplitude weight. The weight-space vector, along with the atomic dictionary, represents a denoised, compressed version of the original signal. The arrangement of the atomic indices in the T-F vector is used for classification. Unsupervised feature learning implemented by a sparse autoencoder learns a single dictionary of basis features from a collection of envelope samples from all speakers. The approach is demonstrated using pairs of speakers from the TIMIT data set. Pairs of speakers are selected randomly from a single district. Each speaker has 10 sentences; two are used for training and 8 for testing. Atomic index probabilities are created for each training sentence and also for each test sentence. Classification is performed by finding the lowest Euclidean distance between the probabilities from the training sentences and the test sentences. Training is done at a 30 dB Signal-to-Noise Ratio (SNR). Testing is performed at SNRs of 0 dB, 5 dB, 10 dB and 30 dB. The algorithm has a baseline classification accuracy of ~93%, averaged over 10 pairs of speakers from the TIMIT data set. The baseline accuracy is attributable to the short sequences of training and test data as well as the overall simplicity of the classification algorithm. The accuracy is not affected by AWGN, and the algorithm produces ~93% accuracy at 0 dB SNR.
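A minimal matching pursuit sketch over a small Gabor-like dictionary, illustrating how the sparse (index, weight) pairs that form the T-F vector are selected; the dictionary size, atom parameters and test signal are illustrative assumptions, not the paper's learned dictionary or TIMIT data.

```python
import numpy as np

# Matching pursuit: at each step, select the atom with the largest correlation,
# record its (index, weight) pair, and subtract its contribution from the residual.
rng = np.random.default_rng(0)
n = 256
t = np.arange(n)

# Gabor-like atoms: windowed sinusoids at a few frequencies and center positions.
atoms = []
for freq in (0.02, 0.05, 0.11, 0.23):
    for center in (64, 128, 192):
        g = np.exp(-0.5 * ((t - center) / 20.0) ** 2) * np.cos(2 * np.pi * freq * t)
        atoms.append(g / np.linalg.norm(g))
D = np.array(atoms)                      # dictionary, shape (n_atoms, n)

signal = 1.5 * D[4] - 0.8 * D[9] + 0.05 * rng.normal(size=n)

residual = signal.copy()
decomposition = []
for _ in range(5):                       # greedy selection of 5 atoms
    corr = D @ residual
    k = int(np.argmax(np.abs(corr)))
    decomposition.append((k, corr[k]))
    residual = residual - corr[k] * D[k]

print(decomposition)
print("residual energy:", float(residual @ residual))
```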

Keywords: time-frequency plane, atomic decomposition, envelope sampling, Gabor atoms, matching pursuit, sparse dictionary learning, sparse autoencoder

Procedia PDF Downloads 280
14736 Transient Analysis and Mitigation of Capacitor Bank Switching on a Standalone Wind Farm

Authors: Ajibola O. Akinrinde, Andrew Swanson, Remy Tiako

Abstract:

Significant losses exist on transmission lines due to distance, as power generating stations can be located far from some isolated settlements. Standalone wind farms could be a good alternative source of power generation for settlements that are far from the grid due to long distances or socio-economic problems. However, uncompensated wind farms consume reactive power, since wind turbines are induction generators. Therefore, capacitor banks are used to compensate reactive power, which in turn improves the voltage profile of the network. Although capacitor banks help improve the voltage profile, they also undergo switching actions due to their compensating response to the variation of various types of load at the consumer's end. These switching activities can cause transient overvoltages on the network, jeopardizing the service life of other equipment in the system. In this paper, the overvoltage caused by these switching activities is investigated using the IEEE 14-bus network to represent a standalone wind farm, and the simulation is performed using ATP/EMTP software. Scenarios involving the use of a pre-insertion resistor and a pre-insertion inductor, as well as controlled switching, were also carried out in order to determine the best mitigation option for reducing the overvoltage.

Keywords: capacitor banks, IEEE bus 14-network, pre-insertion resistor, standalone wind farm

Procedia PDF Downloads 429
14735 The Impact of Distributed Epistemologies on Software Engineering

Authors: Thomas Smith

Abstract:

Many hackers worldwide would agree that, had it not been for linear-time theory, the refinement of Byzantine fault tolerance might never have occurred. After years of significant research into extreme programming, we validate the refinement of simulated annealing. Maw, our new framework for unstable theory, is the solution to all of these issues.

Keywords: distributed, software engineering, DNS, DHCP

Procedia PDF Downloads 336
14734 Stackelberg Security Game for Optimizing Security of Federated Internet of Things Platform Instances

Authors: Violeta Damjanovic-Behrendt

Abstract:

This paper presents an approach for optimal cyber security decisions to protect instances of a federated Internet of Things (IoT) platform in the cloud. The presented solution implements the repeated Stackelberg Security Game (SSG) and a model called the Stochastic Human behaviour model with AttRactiveness and Probability weighting (SHARP). SHARP employs the Subjective Utility Quantal Response (SUQR) for formulating a subjective utility function, which is based on the evaluations of alternative solutions during decision-making. We augment the repeated SSG (including SHARP and SUQR) with a reinforcement learning algorithm called Naïve Q-Learning. Naïve Q-Learning belongs to the category of active and model-free Machine Learning (ML) techniques, in which the agent (either the defender or the attacker) attempts to find an optimal security solution. In this way, we combine game-theoretic (GT) and ML algorithms for discovering optimal cyber security policies. The proposed security optimization components will be validated in a collaborative cloud platform that is based on the Industrial Internet Reference Architecture (IIRA) and its recently published security model.
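A minimal sketch of a stateless (bandit-style) Q-learning update for the defender's action choice in a repeated game; the payoff values and the attacker's behaviour are illustrative assumptions, not the SHARP/SUQR model or the paper's Naïve Q-Learning implementation.

```python
import numpy as np

# The defender repeatedly picks which platform instance to protect, receives a
# payoff depending on whether the (simulated) attacker hit the protected instance,
# and updates its action values. All numbers below are illustrative assumptions.
rng = np.random.default_rng(42)
n_targets = 3
q = np.zeros(n_targets)                  # single-state Q-table over defender actions
alpha, epsilon = 0.1, 0.1
attack_probs = np.array([0.6, 0.3, 0.1]) # assumed attacker preference over targets

for episode in range(5000):
    # epsilon-greedy action selection
    action = rng.integers(n_targets) if rng.random() < epsilon else int(np.argmax(q))
    attacked = rng.choice(n_targets, p=attack_probs)
    reward = 1.0 if attacked == action else -1.0        # caught vs. missed the attack
    q[action] += alpha * (reward - q[action])           # stateless Q-learning update

print("learned action values:", np.round(q, 2))
print("best defender action:", int(np.argmax(q)))
```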

Keywords: security, internet of things, cloud computing, stackelberg game, machine learning, naive q-learning

Procedia PDF Downloads 341