Search results for: perception reaction time
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7274

4754 Spreading Dynamics of a Viral Infection in a Complex Network

Authors: Khemanand Moheeput, Smita S. D. Goorah, Satish K. Ramchurn

Abstract:

We report a computational study of the spreading dynamics of a viral infection in a complex (scale-free) network. The final epidemic size distribution (FESD) was found to be unimodal or bimodal depending on the value of the basic reproductive number R0 . The FESDs occurred on time-scales long enough for intermediate-time epidemic size distributions (IESDs) to be important for control measures. The usefulness of R0 for deciding on the timeliness and intensity of control measures was found to be limited by the multimodal nature of the IESDs and by its inability to inform on the speed at which the infection spreads through the population. A reduction of the transmission probability at the hubs of the scale-free network decreased the occurrence of the larger-sized epidemic events of the multimodal distributions. For effective epidemic control, an early reduction in transmission at the index cell and its neighbors was essential.
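
As an illustration of the kind of simulation described above, the following minimal Python sketch runs a stochastic SIR process on a Barabási–Albert (scale-free) network and collects the final epidemic size over many runs; the network size, transmission probability and recovery time are illustrative assumptions, not the parameters used in the study.

```python
# Minimal sketch: stochastic SIR spreading on a Barabasi-Albert (scale-free) network.
# Network size, transmission probability and recovery time are illustrative
# assumptions, not the parameters used in the paper.
import random
import networkx as nx

def run_epidemic(n=1000, m=3, p_transmit=0.05, recovery_steps=5, seed=None):
    rng = random.Random(seed)
    g = nx.barabasi_albert_graph(n, m, seed=seed)
    status = {v: "S" for v in g}           # S(usceptible), I(nfected), R(ecovered)
    clock = {}                             # time remaining until recovery
    index_case = rng.randrange(n)
    status[index_case], clock[index_case] = "I", recovery_steps

    total_infected = 1
    while any(s == "I" for s in status.values()):
        infected = [v for v in g if status[v] == "I"]
        for v in infected:
            for w in g.neighbors(v):
                if status[w] == "S" and rng.random() < p_transmit:
                    status[w], clock[w] = "I", recovery_steps
                    total_infected += 1
            clock[v] -= 1
            if clock[v] == 0:
                status[v] = "R"
    return total_infected                  # final epidemic size

# Final epidemic size distribution over many runs (unimodal or bimodal
# depending on the effective reproductive number).
sizes = [run_epidemic(seed=i) for i in range(200)]
print(min(sizes), max(sizes), sum(sizes) / len(sizes))
```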

Keywords: Basic reproductive number, epidemic control, scale-free network, viral infection.

4753 An AFM Approach of RBC Micro and Nanoscale Topographic Features during Storage

Authors: K. Santacruz-Gomez, E. Silva-Campa, S. Álvarez-García, V. Mata-Haro, D. Soto-Puebla, M. Pedroza-Montero

Abstract:

Blood gamma irradiation is the only available method to prevent transfusion-associated graft-versus-host disease (TA-GVHD). However, when blood is irradiated, determining its shelf life is crucial. Non-irradiated blood has a shelf life of 21 to 35 days when preserved with an anticoagulant solution and stored at 4°C. During storage, red blood cells (RBC) undergo a series of biochemical, biomechanical and molecular changes known as the storage lesion (SL). SL includes loss of structural integrity of the RBC, a decrease in 2,3-diphosphoglyceric acid levels, and an increase in both potassium ion concentration and hemoglobin (Hb). On the other hand, Atomic Force Microscopy (AFM) represents a versatile tool for nanoscale, high-resolution topographic analysis of biological systems. In order to evaluate SL in irradiated and non-irradiated blood, RBC topography and morphometric parameters were obtained with an AFM XE-BIO system. Cell viability was followed using flow cytometry. Our results showed that early markers such as nanoscale roughness allow blood quality to be evaluated from another perspective.

Keywords: AFM, Blood γ-irradiation, roughness, Storage lesion.

4752 Probabilistic Model Development for Project Performance Forecasting

Authors: Milad Eghtedari Naeini, Gholamreza Heravi

Abstract:

In this paper, a model for forecasting project cost performance is developed based on past project cost and time performance. This study presents a probabilistic project control concept to assure an acceptable forecast of project cost performance. In this concept, project activities are classified into sub-groups called control accounts. A Stochastic S-Curve (SS-Curve) is then obtained for each sub-group, and the project SS-Curve is obtained by summing the sub-groups' SS-Curves. In this model, project cost uncertainties are considered through Beta distribution functions of the activity costs required to complete the project at every selected time section of project accomplishment, which are extracted from a variety of sources. Based on this model, after a given percentage of project progress, project performance is measured via Earned Value Management to adjust the primary cost probability distribution functions. The future project cost performance is then predicted using Monte Carlo simulation.
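
A minimal Python sketch of the Monte Carlo step described above, summing Beta-distributed control-account costs into a project cost distribution; the control accounts and their Beta parameters are hypothetical, not taken from the paper.

```python
# Minimal sketch: Monte Carlo forecast of project cost from Beta-distributed
# activity costs (the control-account parameters below are hypothetical).
import numpy as np

rng = np.random.default_rng(42)

# Each control account: (min cost, max cost, Beta shape a, Beta shape b)
control_accounts = [
    (100.0, 150.0, 2.0, 5.0),
    (200.0, 320.0, 3.0, 3.0),
    (50.0,  90.0,  4.0, 2.0),
]

def simulate_project_cost(n_runs=10_000):
    total = np.zeros(n_runs)
    for lo, hi, a, b in control_accounts:
        # Scale the Beta(a, b) sample onto the [lo, hi] cost range.
        total += lo + (hi - lo) * rng.beta(a, b, size=n_runs)
    return total

costs = simulate_project_cost()
p10, p50, p90 = np.percentile(costs, [10, 50, 90])
print(f"P10={p10:.1f}  P50={p50:.1f}  P90={p90:.1f}")
```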

Keywords: Monte Carlo method, Probabilistic model, Project forecasting, Stochastic S-curve

4751 Enhancement of Cement Mortar Mechanical Properties with Replacement of Seashell Powder

Authors: Abdoullah Namdar, Fadzil Mat Yahaya

Abstract:

Many synthetic additives have been used to improve cement mortar and concrete characteristics, but natural additives are an environmentally friendly option. Seashell powder in quantities of 2% and 4% was substituted into cement mortar, and the mixes were compared with plain cement mortar at an early age of 7 days. Strain gauges were installed on beams and cubes to monitor the variation of flexural and compressive strength. The main objective of this paper is to study the effect of linear static force on the flexural and compressive strength of the modified cement mortar. The results indicate that replacement with an appropriate proportion of seashell powder enhances the mechanical properties of cement mortar. The 2% seashell replacement improves the deflection, time to failure and maximum load to failure of the beam and cube specimens, and the same holds for the compressive modulus of elasticity. Increasing the seashell replacement to 4% reduces the flexural strength, compressive strength and strain of the cement mortar.

Keywords: Compressive strength, flexural strength, compressive modulus elasticity, time to failure, deflection.

4750 H-Infinity and RST Position Controllers of Rotary Traveling Wave Ultrasonic Motor

Authors: M. Brahim, I. Bahri, Y. Bernard

Abstract:

Traveling Wave Ultrasonic Motor (TWUM) is a compact, precise, and silent actuator generating high torque at low speed without gears. Moreover, the TWUM has a high holding torque without supply, which makes this motor an attractive solution for holding the position of robotic arms. However, its nonlinear dynamics and the presence of load-dependent dead zones often limit its use. Those issues can be overcome in closed loop with effective and precise controllers. In this paper, robust H-infinity (H∞) and discrete-time RST position controllers are presented. The H∞ controller is designed in continuous time with additional weighting filters to ensure robustness in the case of an uncertain motor model and external disturbances. A robust RST controller based on the pole placement method is also designed and compared to the H∞ controller. A Simulink model of the TWUM is used to validate the stability and robustness of the two proposed controllers.

Keywords: Piezoelectric motors, position control, H∞, RST, stability criteria, robustness.

4749 Factors of Effective Business Software Systems Development and Enhancement Projects Work Effort Estimation

Authors: Beata Czarnacka-Chrobot

Abstract:

The majority of Business Software Systems (BSS) Development and Enhancement Projects (D&EP) fail to meet the criteria of their effectiveness, which leads to considerable financial losses. One of the fundamental reasons for such projects' exceptionally low success rate is improperly derived estimates of their costs and time. In the case of BSS D&EP these attributes are determined by the work effort, yet reliable and objective effort estimation still appears to be a great challenge to software engineering. Thus this paper is aimed at presenting the most important synthetic conclusions coming from the author's own studies concerning the main factors of effective BSS D&EP work effort estimation. Thanks to rational investment decisions made on the basis of reliable and objective criteria, it is possible to reduce losses caused not only by abandoned projects but also by the large scale of time and cost overruns in BSS D&EP execution.

Keywords: Benchmarking data, business software systems development and enhancement projects, effort estimation, software engineering economics, software functional size measurement.

4748 Effects of Corruption and Logistics Performance Inefficiencies on Container Throughput: The Latin America Case

Authors: Fernando Seabra, Giulia P. Flores, Karolina C. Gomes

Abstract:

Trade liberalization measures, such as import tariff cuts, are not a sufficient trigger for trade growth. Given that price margins are narrow, traders and cargo operators tend to opt out of markets where the process of goods clearance is slow and costly. Excess paperwork and slow customs dispatch not only lead to institutional breakdowns and corruption but also to increasing transaction costs and trade constraints. The objective of this paper is, therefore, two-fold: first, to evaluate the relationship between institutional and infrastructural performance indexes and trade growth in container throughput; and, second, to investigate the causes for differences in container demurrage and detention fees in Latin American countries (using other emerging countries as a benchmark). The analysis is focused on manufactured goods, typically transported by containers. Institutional and infrastructure bottlenecks and, therefore, country logistics efficiency – measured by the Logistics Performance Index (LPI, World Bank-WB) – are compared with other indexes, such as the Doing Business index (WB) and the Corruption Perception Index (Transparency International). The main results, based on the comparison between Latin American countries and the other emerging countries, point out that growth in container trade is directly related to LPI performance. It has also been found that the main hypothesis is valid, as aspects that more specifically identify trade facilitation and corruption are significant drivers of logistics performance. The examination of port efficiency (demurrage and detention fees) has demonstrated that a higher level of efficiency is not necessarily related to lower charges; however, reductions in fees have been more significant within non-Latin American emerging countries.

Keywords: Container throughput, logistics performance, corruption, Latin America.

4747 Applications of High Intensity Ultrasound to Modify Millet Protein Concentrate Functionality

Authors: B. Nazari, M. A. Mohammadifar, S. Shojaee-Aliabadi, L. Mirmoghtadaie

Abstract:

Millet, as a new source of plant protein, has not been used in food applications due to its poor functional properties. In this study, the effect of high intensity ultrasound (frequency: 20 kHz, continuous flow) (US) at 100% amplitude for varying times (5, 12.5, and 20 min) on solubility, emulsifying activity index (EAI), emulsion stability (ES), foaming capacity (FC), and foaming stability (FS) of millet protein concentrate (MPC) was evaluated. In addition, the structural properties of the best treatments, such as molecular weight and surface charge, were compared with the control sample to prove the US effect. The US treatments significantly (P<0.05) increased the solubility of the native MPC (65.8±0.6%) at all sonication times, with the maximum solubility recorded for the 12.5 min treatment (96.9±0.82%). The FC of MPC was also significantly affected by the US treatment. Increasing the sonication time up to 12.5 min significantly increased the FC of native MPC (271.03±4.51 ml), but a further increase reduced it significantly. Minimal improvements were observed in the FS of all sonicated MPC compared to the native MPC. A sonication time of 12.5 min affected the EAI and ES of the native MPC more markedly than 5 and 20 min, which may be attributed to a greater increase in the proteins' tendency to adsorb at the oil-water interface after the US treatment at this time. SDS-PAGE analysis showed changes in the molecular weight of MPC attributed to shearing forces created by the cavitation phenomenon. This phenomenon also caused an increase in the exposure of more amino acids with negative charge on the surface of US-treated MPC, as demonstrated by Zetasizer data. High intensity ultrasound, as a green technology, can significantly increase the functional properties of MPC and can make it usable for food applications.

Keywords: Millet protein concentrate, Functional properties, Structural properties, High intensity ultrasound.

4746 A Bi-Objective Preventive Healthcare Facility Network Design with Incorporating Cost and Time Saving

Authors: Mehdi Seifbarghy, Keyvan Roshan

Abstract:

The main goal of preventive healthcare is to decrease the likelihood and severity of potentially life-threatening illnesses through protection and early detection. The establishment and staffing costs, along with the sum of the travel and waiting times that clients spend, are considered as the objective functions of the proposed nonlinear integer programming model. In this paper, we have proposed a bi-objective mathematical model for designing a network of preventive healthcare facilities so as to minimize the aforementioned objectives simultaneously. Moreover, each facility acts as an M/M/1 queuing system. The number of facilities to be established, the location of each facility, and the level of technology chosen for each facility are the main determinants of a healthcare facility network. Finally, to demonstrate the performance of the proposed model, four multi-objective decision making techniques are presented to solve the model.
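
A minimal Python sketch of how the time objective can be evaluated when each open facility behaves as an M/M/1 queue, using the standard sojourn-time formula W = 1/(μ − λ); the facility parameters and client allocation are hypothetical.

```python
# Minimal sketch: evaluating the time objective when each open facility is an
# M/M/1 queue. Facility parameters and client allocation below are hypothetical.
def mm1_sojourn_time(arrival_rate, service_rate):
    """Expected time in system (waiting + service) for a stable M/M/1 queue."""
    if arrival_rate >= service_rate:
        return float("inf")                 # unstable queue
    return 1.0 / (service_rate - arrival_rate)

# facility -> (clients per hour assigned, service rate per hour, avg travel time per client in hours)
facilities = {
    "A": (4.0, 6.0, 0.5),
    "B": (2.5, 3.0, 0.3),
}

# Objective: expected travel + waiting (sojourn) time per client, weighted by
# each facility's share of the total demand.
total_demand = sum(lam for lam, _, _ in facilities.values())
expected_time_per_client = sum(
    (lam / total_demand) * (travel + mm1_sojourn_time(lam, mu))
    for lam, mu, travel in facilities.values()
)
print(f"Expected travel + waiting time per client: {expected_time_per_client:.2f} h")
```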

Keywords: Preventive healthcare problems, Non-linear integer programming models, Multi-objective decision making techniques

4745 Improving the Analytical Power of Dynamic DEA Models, by the Consideration of the Shape of the Distribution of Inputs/Outputs Data: A Linear Piecewise Decomposition Approach

Authors: Elias K. Maragos, Petros E. Maravelakis

Abstract:

In Dynamic Data Envelopment Analysis (DDEA), which is a subfield of Data Envelopment Analysis (DEA), the productivity of Decision Making Units (DMUs) is considered in relation to time. In this case, as accepted by most researchers, there are outputs which are produced by a DMU to be used as inputs at a future time. Those outputs are known as intermediates. The common models in DDEA do not take into account the shape of the distribution of those input, output or intermediate data, assuming that the distribution of their virtual value does not deviate from linearity. This weakness limits the accuracy and analytical power of the traditional DDEA models. In this paper, the authors, using the concept of piecewise linear inputs and outputs, propose an extended DDEA model. The proposed model increases the flexibility of the traditional DDEA models and improves the measurement of the dynamic performance of DMUs.
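
For context, the following Python sketch solves the standard (static) input-oriented CCR DEA envelopment model on hypothetical data with scipy; the paper's dynamic, piecewise-linear extension is not reproduced here.

```python
# Minimal sketch: the standard (static) input-oriented CCR DEA model that the
# dynamic, piecewise-linear extension builds on. Data below are hypothetical.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 4.0, 3.0, 5.0],      # inputs:  rows = inputs, cols = DMUs
              [3.0, 1.0, 2.0, 4.0]])
Y = np.array([[1.0, 2.0, 1.5, 2.5]])     # outputs: rows = outputs, cols = DMUs

def ccr_efficiency(o):
    """Efficiency score of DMU o: decision variables are [theta, lambda_1..lambda_n]."""
    m, n = X.shape
    s, _ = Y.shape
    c = np.concatenate(([1.0], np.zeros(n)))        # minimise theta
    # X @ lam - theta * x_o <= 0
    A_in = np.hstack((-X[:, [o]], X))
    b_in = np.zeros(m)
    # -Y @ lam <= -y_o   (i.e. Y @ lam >= y_o)
    A_out = np.hstack((np.zeros((s, 1)), -Y))
    b_out = -Y[:, o]
    res = linprog(c,
                  A_ub=np.vstack((A_in, A_out)),
                  b_ub=np.concatenate((b_in, b_out)))
    return res.fun

for o in range(X.shape[1]):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```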

Keywords: Data envelopment analysis, Dynamic DEA, Piecewise linear inputs, Piecewise linear outputs.

4744 2-DOF Observer Based Controller for First Order with Dead Time Systems

Authors: Ashu Ahuja, Shiv Narayan, Jagdish Kumar

Abstract:

This paper realizes a 2-DOF controller structure for first-order systems with time delay. Co-prime factorization is used to design the observer-based controller K(s), representing one degree of freedom. The problem is based on the H∞ norm of mixed sensitivity and aims to achieve stability, robustness and disturbance rejection. The other degree of freedom, the prefilter F(s), is then formulated as a fixed-structure polynomial controller to match the open-loop behaviour of a reference model. This model-matching problem is solved by minimizing the integral square error between the reference model and the proposed model. The feedback controller and prefilter designs are posed as optimization problems and solved using Particle Swarm Optimization (PSO). To show the efficiency of the designed approach, a variety of processes are considered and compared for analysis.
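
A minimal Python sketch of the model-matching step: a hand-rolled PSO tunes two parameters of a candidate model so that its step response matches a reference model in the integral-square-error sense; the transfer functions and PSO settings are illustrative, not those of the paper.

```python
# Minimal sketch: PSO tuning two parameters of a candidate second-order model so
# that its step response matches a reference model in the integral-square-error
# sense. Transfer functions and PSO settings are illustrative, not the paper's.
import numpy as np
from scipy import signal

t = np.linspace(0, 10, 500)
ref = signal.TransferFunction([4.0], [1.0, 2.8, 4.0])      # reference model
_, y_ref = signal.step(ref, T=t)

def ise(params):
    a, b = params
    if a <= 0 or b <= 0:                                    # keep the candidate stable
        return 1e9
    _, y = signal.step(signal.TransferFunction([b], [1.0, a, b]), T=t)
    return np.trapz((y_ref - y) ** 2, t)                    # integral square error

def pso(cost, bounds, n_particles=20, iters=60, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    x = rng.uniform(lo, hi, size=(n_particles, len(bounds)))
    v = np.zeros_like(x)
    pbest, pbest_cost = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[np.argmin(pbest_cost)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        costs = np.array([cost(p) for p in x])
        better = costs < pbest_cost
        pbest[better], pbest_cost[better] = x[better], costs[better]
        gbest = pbest[np.argmin(pbest_cost)].copy()
    return gbest, pbest_cost.min()

best, best_ise = pso(ise, bounds=[(0.1, 10.0), (0.1, 10.0)])
print("best (a, b):", best, "ISE:", best_ise)
```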

Keywords: 2-DOF, integral square error, mixed sensitivity function, observer based controller, particle swarm optimization, prefilter.

4743 Methane versus Carbon Dioxide: Mitigation Prospects

Authors: Alexander J. Severinsky, Allen L. Sessoms

Abstract:

Atmospheric carbon dioxide (CO2) has dominated the discussion around the causes of climate change. This is a reflection of a 100-year time horizon for all greenhouse gases that became a norm.  The 100-year time horizon is much too long – and yet, almost all mitigation efforts, including those set in the near-term frame of within 30 years, are still geared toward it. In this paper, we show that for a 30-year time horizon, methane (CH4) is the greenhouse gas whose radiative forcing exceeds that of CO2. In our analysis, we use the radiative forcing of greenhouse gases in the atmosphere, because they directly affect the rise in temperature on Earth. We found that in 2019, the radiative forcing (RF) of methane was ~2.5 W/m2 and that of carbon dioxide was ~2.1 W/m2. Under a business-as-usual (BAU) scenario until 2050, such forcing would be ~2.8 W/m2 and ~3.1 W/m2 respectively. There is a substantial spread in the data for anthropogenic and natural methane (CH4) emissions, along with natural gas, (which is primarily CH4), leakages from industrial production to consumption. For this reason, we estimate the minimum and maximum effects of a reduction of these leakages, and assume an effective immediate reduction by 80%. Such action may serve to reduce the annual radiative forcing of all CH4 emissions by ~15% to ~30%. This translates into a reduction of RF by 2050 from ~2.8 W/m2 to ~2.5 W/m2 in the case of the minimum effect that can be expected, and to ~2.15 W/m2 in the case of the maximum effort to reduce methane leakages. Under the BAU, we find that the RF of CO2 will increase from ~2.1 W/m2 now to ~3.1 W/m2 by 2050. We assume a linear reduction of 50% in anthropogenic emission over the course of the next 30 years, which would reduce the radiative forcing of CO2 from ~3.1 W/m2 to ~2.9 W/m2. In the case of "net zero," the other 50% of only anthropogenic CO2 emissions reduction would be limited to being either from sources of emissions or directly from the atmosphere. In this instance, the total reduction would be from ~3.1 W/m2 to ~2.7 W/m2, or ~0.4 W/m2. To achieve the same radiative forcing as in the scenario of maximum reduction of methane leakages of ~2.15 W/m2, an additional reduction of radiative forcing of CO2 would be approximately 2.7 -2.15 = 0.55 W/m2. In total, one would need to remove ~660 GT of CO2 from the atmosphere in order to match the maximum reduction of current methane leakages, and ~270 GT of CO2 from emitting sources, to reach "negative emissions". This amounts to over 900 GT of CO2.
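
A short worked check of the radiative-forcing bookkeeping quoted above (figures taken directly from the abstract):

```python
# Worked check of the radiative-forcing figures quoted above (all values in W/m^2).
rf_ch4_bau_2050 = 2.8
rf_ch4_min_cut = 2.5          # 80% leakage reduction, minimum estimated effect
rf_ch4_max_cut = 2.15         # 80% leakage reduction, maximum estimated effect

rf_co2_bau_2050 = 3.1
rf_co2_half_anthropogenic = 2.9   # 50% linear cut of anthropogenic emissions
rf_co2_net_zero = 2.7             # remaining 50% removed at source / from the air

# Extra CO2 forcing reduction needed to match the best-case methane scenario.
extra_needed = rf_co2_net_zero - rf_ch4_max_cut
print(f"Additional CO2 forcing reduction required: {extra_needed:.2f} W/m^2")  # ~0.55
```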

Keywords: Methane Leakages, Methane Radiative Forcing, Methane Mitigation, Methane Net Zero.

4742 Improving TNT Curing Process by Using Infrared Camera

Authors: O. Srihakulung, Y. Soongsumal

Abstract:

Among the chemicals used for ammunition production, TNT (trinitrotoluene) has played a significant role since World Wars I and II. Various types of military weapons utilize TNT in the casting process. However, in the TNT casting process for warheads it is difficult to control the cooling rate of the liquid TNT, because the casting process lacks equipment to detect the temperature during the casting procedure. This study presents the temperature detected by an infrared camera to illustrate the cooling rate and cooling zone of curing, and demonstrates optimization of the TNT condition to reduce the risk of air gaps occurring in the warhead, which can result in its destruction afterward; casting defects can cause premature initiation of explosive-filled projectiles in response to set-back forces during gun firing. Finally, the study can help improve the TNT casting process. The operators can control the curing of the TNT inside the case by raising the heating rod at the proper time. Consequently, this can greatly reduce rework time if air gaps occur and increase strength at a lower elastic modulus. Therefore, it can be clearly concluded that the use of infrared cameras in this process is another method to improve the casting procedure.

Keywords: Infrared camera, TNT casting, warhead, curing.

4741 Tool Damage and Adhesion Effects in Turning and Drilling of Hardened Steels

Authors: Chris M. Taylor, Ian Cook, Raul Alegre, Pedro Arrazola, Phil Spiers

Abstract:

Noteworthy results have been obtained in the turning and drilling of hardened high-strength steels using tungsten carbide based cutting tools. In a finish turning process, it was seen that surface roughness and tool flank wear followed very different trends against cutting time. The suggested explanation for this behaviour is that the profile cut into the workpiece surface is determined by the tool’s cutting edge profile. It is shown that the profile appearing on the cut surface changes rapidly over time, so the profile of the tool cutting edge should also be changing rapidly. Workpiece material adhered onto the cutting tool, which is also known as a built-up edge, is a phenomenon which could explain the observations made. In terms of tool damage modes, workpiece material adhesion is believed to have contributed to tool wear in examples provided from finish turning, thread turning and drilling. Additionally, evidence of tool fracture and tool abrasion were recorded.

Keywords: Turning, drilling, adhesion, wear, hard steels.

4740 Synthesis and Application of Tamarind Hydroxypropane Sulphonic Acid Resin for Removal of Heavy Metal Ions from Industrial Wastewater

Authors: Aresh Vikram Singh, Sarika Nagar

Abstract:

A tamarind-based resin containing hydroxypropane sulphonic acid groups has been synthesized and its adsorption behavior for heavy metal ions has been investigated using batch and column experiments. The hydroxypropane sulphonic acid group has been incorporated onto tamarind by a modified Porath's method of functionalisation of polysaccharides. The tamarind hydroxypropane sulphonic acid (THPSA) resin can selectively remove heavy metal ions contained in industrial wastewater. The THPSA resin was characterized by FTIR and thermogravimetric analysis. The effects of various adsorption conditions, such as pH, treatment time and adsorbent dose, were also investigated. The optimum adsorption condition was found at pH 6, 120 minutes of equilibrium time and 0.1 gram of resin dose. The order of the distribution coefficient values was determined.
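
A minimal Python sketch of the distribution coefficient calculation, assuming the common batch-adsorption definition Kd = ((C0 − Ce)/Ce)·(V/m); the concentrations used are hypothetical, not the paper's data.

```python
# Minimal sketch: distribution coefficient from a batch adsorption experiment,
# using the common definition Kd = ((C0 - Ce) / Ce) * (V / m).
# The concentrations below are hypothetical, not the paper's data.
def distribution_coefficient(c0_mg_per_l, ce_mg_per_l, volume_l, mass_g):
    """Kd in mL/g for initial concentration c0 and equilibrium concentration ce."""
    sorbed_fraction = (c0_mg_per_l - ce_mg_per_l) / ce_mg_per_l
    return sorbed_fraction * (volume_l * 1000.0) / mass_g   # convert L to mL

# Example: 50 mL of 20 mg/L metal solution, 0.1 g THPSA resin, 120 min at pH 6.
print(f"Kd = {distribution_coefficient(20.0, 2.5, 0.05, 0.1):.0f} mL/g")
```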

Keywords: Distribution coefficient, industrial wastewater, polysaccharides, tamarind hydroxypropane sulphonic acid resin, thermogravimetric analysis.

4739 Secure Resource Selection in Computational Grid Based on Quantitative Execution Trust

Authors: G. Kavitha, V. Sankaranarayanan

Abstract:

Grid computing provides a virtual framework for controlled sharing of resources across institutional boundaries. Recently, trust has been recognised as an important factor for selection of optimal resources in a grid. We introduce a new method that provides a quantitative trust value, based on past interactions and present environment characteristics. This quantitative trust value is used to select a suitable resource for a job and eliminates run-time failures arising from incompatible user-resource pairs. The proposed work will act as a tool to calculate the trust values of the various components of the grid and thereby improve the success rate of the jobs submitted to resources on the grid. Access to a resource depends not only on the identity and behaviour of the resource but also upon its context of transaction, time of transaction, connectivity bandwidth, availability of the resource and load on the resource. The quality of the recommender is also evaluated based on the accuracy of the feedback provided about a resource. The jobs are submitted for execution to the selected resource after finding the overall trust value of the resource. The overall trust value is computed with respect to the subjective and objective parameters.
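
Purely as an illustration of such a quantitative trust value, the sketch below combines past-interaction feedback, recommendations and current resource attributes into a weighted score; the attributes and weights are assumptions, not the authors' formula.

```python
# Illustrative sketch only: combining past-interaction feedback with current
# resource attributes into a single trust score. The attribute names and
# weights are assumptions, not the formula used in the paper.
def direct_trust(successes, failures):
    """Trust from the requester's own past interactions with the resource."""
    total = successes + failures
    return successes / total if total else 0.5      # neutral prior when unknown

def objective_trust(bandwidth_mbps, availability, load):
    """Trust from the resource's current context (availability and load in [0, 1])."""
    return 0.4 * min(bandwidth_mbps / 100.0, 1.0) + 0.4 * availability + 0.2 * (1.0 - load)

def overall_trust(successes, failures, recommendations, bandwidth_mbps, availability, load,
                  w_direct=0.5, w_reco=0.2, w_obj=0.3):
    reco = sum(recommendations) / len(recommendations) if recommendations else 0.5
    return (w_direct * direct_trust(successes, failures)
            + w_reco * reco
            + w_obj * objective_trust(bandwidth_mbps, availability, load))

# A job would be submitted to the resource with the highest overall trust value.
print(f"{overall_trust(18, 2, [0.9, 0.8], bandwidth_mbps=80, availability=0.95, load=0.3):.2f}")
```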

Keywords: access control, feedback, grid computing, reputation, security, trust, trust parameter.

4738 Effect of Oxytocin on Cytosolic Calcium Concentration of Alpha and Beta Cells in Pancreas

Authors: Rauza Sukma Rita, Katsuya Dezaki, Yuko Maejima, Toshihiko Yada

Abstract:

Oxytocin is a nine-amino acid peptide synthesized in the paraventricular nucleus (PVN) and supraoptic nucleus (SON) of the hypothalamus. Oxytocin promotes contraction of the uterus during birth and milk ejection during breast feeding. Although oxytocin receptors are found predominantly in the breasts and uterus of females, many tissues and organs express oxytocin receptors, including the pituitary, heart, kidney, thymus, vascular endothelium, adipocytes, osteoblasts, adrenal gland, pancreatic islets, and many cell lines. In pancreatic islets, oxytocin receptors are expressed in both α-cells and β-cells, with stronger expression in α-cells. However, to our knowledge there are no reports yet on the effect of oxytocin on the cytosolic calcium response of α- and β-cells. This study aims to investigate the effect of oxytocin on α-cells and β-cells and its oscillation pattern. Islets of Langerhans from wild-type mice were isolated by collagenase digestion. Isolated and dissociated single cells (either α-cells or β-cells) on coverslips were mounted in an open chamber and superfused in HKRB. Cytosolic calcium concentration ([Ca2+]i) in single cells was measured by fura-2 microfluorimetry. After measurement of [Ca2+]i, α-cells were identified by subsequent immunocytochemical staining using an anti-glucagon antiserum. In β-cells, the [Ca2+]i increase in response to oxytocin was observed only under the 8.3 mM glucose condition, whereas in α-cells an oxytocin-induced [Ca2+]i increase was observed at both 2.8 mM and 8.3 mM glucose. Oscillations were induced more frequently in β-cells than in α-cells. In conclusion, the present study demonstrated that oxytocin directly interacts with both α-cells and β-cells and induces an increase in [Ca2+]i with specific patterns.

Keywords: α-cells, β-cells, cytosolic calcium concentration, oscillation, oxytocin.

4737 An Analysis of Real-Time Distributed System under Different Priority Policies

Authors: Y. Jayanta Singh, Suresh C. Mehrotra

Abstract:

A real-time distributed computing system uses heterogeneously networked computers to solve a single problem. Coordination of activities among the computers is therefore a complex task, and deadlines make it more complex. Performance depends on many factors such as traffic workloads, database system architecture, underlying processors, disk speeds, etc. A simulation study has been performed to analyze performance under different transaction scheduling conditions: different workloads, arrival rates, priority policies, altered slack factors and a preemptive policy. The performance metric of the experiments is the missed percent, that is, the percentage of transactions that the system is unable to complete. The throughput of the system depends on the arrival rate of transactions. Performance can be enhanced by altering the slack factor value; working on the slack value of a transaction can help prevent some transactions from being killed or aborted. Under the preemptive policy, many extra executions of new transactions can be carried out.
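
A minimal Python sketch of how the missed-percent metric responds to the slack factor: transactions arrive at a Poisson rate, each deadline is the arrival time plus the slack factor times the execution time, and transactions that cannot finish in time under FCFS are counted as missed; all parameters are hypothetical.

```python
# Minimal sketch: how the "missed percent" metric responds to the arrival rate
# and the slack factor. Single FCFS server, Poisson arrivals, exponential
# execution times -- all parameters are hypothetical.
import random

def missed_percent(arrival_rate, mean_exec, slack_factor, n_tx=20_000, seed=1):
    rng = random.Random(seed)
    clock, server_free, missed = 0.0, 0.0, 0
    for _ in range(n_tx):
        clock += rng.expovariate(arrival_rate)          # next arrival
        exec_time = rng.expovariate(1.0 / mean_exec)
        deadline = clock + slack_factor * exec_time     # deadline from the slack factor
        start = max(clock, server_free)
        finish = start + exec_time
        if finish > deadline:
            missed += 1                                 # transaction cannot meet its deadline
        else:
            server_free = finish                        # only completed work occupies the server
    return 100.0 * missed / n_tx

for slack in (2, 4, 8):
    print(f"slack factor {slack}: missed percent = {missed_percent(0.8, 1.0, slack):.1f}%")
```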

Keywords: Real distributed systems, slack factors, transaction scheduling, priority policies.

4736 Robust Batch Process Scheduling in Pharmaceutical Industries: A Case Study

Authors: Tommaso Adamo, Gianpaolo Ghiani, Antonio D. Grieco, Emanuela Guerriero

Abstract:

Batch production plants provide a wide range of scheduling problems. In pharmaceutical industries a batch process is usually described by a recipe, consisting of an ordering of tasks to produce the desired product. In this research work we focused on pharmaceutical production processes requiring the culture of a microorganism population (i.e. bacteria, yeasts or antibiotics). Several sources of uncertainty may influence the yield of the culture processes, including (i) low performance and quality of the cultured microorganism population or (ii) microbial contamination. For these reasons, robustness is a valuable property for the considered application context. In particular, a robust schedule will not collapse immediately when a cell of microorganisms has to be thrown away due to a microbial contamination. Indeed, a robust schedule should change locally and in small proportions, and the overall performance measure (i.e. makespan, lateness) should change little if at all. In this research work we formulated a constraint programming optimization (COP) model for the robust planning of antibiotics production. We developed a discrete-time model with a multi-criteria objective, ordering the different criteria and performing a lexicographic optimization. A feasible solution of the proposed COP model is a schedule of a given set of tasks onto the available resources. The schedule has to satisfy task precedence constraints, resource capacity constraints and time constraints. In particular, time constraints model task due dates and resource availability time-window constraints. To improve the schedule robustness, we modeled the concept of (a, b) super-solutions, where (a, b) are input parameters of the COP model. An (a, b) super-solution is one in which, if a variables (i.e. the completion times of culture tasks) lose their values (i.e. the cultures are contaminated), the solution can be repaired by assigning these variables new values (i.e. the completion times of backup culture tasks) and changing at most b other variables (i.e. delaying the completion of at most b other tasks). The efficiency and applicability of the proposed model are demonstrated by solving instances taken from a real-life pharmaceutical company. Computational results showed that the determined super-solutions are near-optimal.

Keywords: Constraint programming, super-solutions, robust scheduling, batch process, pharmaceutical industries.

4735 The Effect of Eight Weeks of Aerobic Training on Indices of Cardio-Respiratory and Exercise Tolerance in Overweight Women with Chronic Asthma

Authors: Somayeh Negahdari, Mohsen Ghanbarzadeh, Masoud Nikbakht, Heshmatolah Tavakol

Abstract:

Asthma, obesity and overweight are the main factors causing changes within the heart and respiratory airways. Asthma symptoms are normally observed during exercise. Epidemiological studies have indicated that asthma symptoms occur due to certain lifestyle habits, for example a sedentary lifestyle. In this study, eight weeks of aerobic exercise resulted in an overall positive effect in overweight women experiencing mild chronic asthma. This quasi-experimental applied research was based on experimental and control groups. The experimental group (seven patients) and the control group (n = 7) were assessed before and after the test. The training intensity was determined according to the Borg dyspnea and fatigue perception index. Participants in the study performed a sub-maximal aerobic activity schedule (45% to 80% of maximum heart rate) for two months, while the control group (n = 7) stayed away from aerobic exercise. Data evaluation and analysis of covariance compared the pre-test and post-test values with a paired t-test at a significance level of P ≤ 0.05. After eight weeks of exercise, the results of the experimental group show a significant decrease in resting heart rate, systolic blood pressure and minute ventilation, and a significant increase in maximal oxygen uptake and exercise tolerance (P ≤ 0.05). In the control group, there was no significant difference in these parameters (P ≤ 0.05). The results indicate that aerobic activity can strengthen the respiratory muscles, while other physiological factors could result in breathing and heart recovery. Aerobic activity also resulted in favorable changes in cardiovascular parameters and in the exercise tolerance of overweight women with chronic asthma.

Keywords: Asthma, respiratory cardiac index, exercise tolerance, aerobic, overweight.

4734 Development of an Avionics System for Flight Data Collection of an UAV Helicopter

Authors: Nikhil Ramaswamy, S. N. Omkar, Kashyap H. Nathwani, Anil M. Vanjare

Abstract:

In the present work, the development of an avionics system for flight data collection of a Raptor 30 V2 is carried out. For the data acquisition, both on-ground and onboard avionics systems are developed for testing a small-scale Unmanned Aerial Vehicle (UAV) helicopter. The onboard avionics record the helicopter state outputs, namely accelerations, angular rates and Euler angles, in real time, and the on-ground avionics system records the inputs given to the radio-controlled helicopter through a transmitter, in real time. The avionics systems are designed and developed taking into consideration low weight, small size, anti-vibration, low power consumption, and easy interfacing. To mitigate the medium-frequency vibrations imparted to the UAV helicopter during flight, a damper is designed and its performance is evaluated. A number of flight tests are carried out and the data obtained are then analyzed for accuracy and repeatability, and conclusions are drawn.

Keywords: Data collection, Flight Testing, Onground and Onboard Avionics, UAV helicopter

4733 Effect of Time-Periodic Boundary Temperature on the Onset of Nanofluid Convection in a Layer of a Saturated Porous Medium

Authors: J.C. Umavathi

Abstract:

The linear stability of nanofluid convection in a horizontal porous layer is examined theoretically when the walls of the porous layer are subjected to time-periodic temperature modulation. The model used for the nanofluid incorporates the effects of Brownian motion and thermophoresis, while the Darcy model is used for the porous medium. The analysis reveals that for a typical nanofluid (with large Lewis number) the prime effect of the nanoparticles is via a buoyancy effect coupled with the conservation of nanoparticles, the contribution of nanoparticles to the thermal energy equation being a second-order effect. It is found that the critical thermal Rayleigh number can be reduced by a substantial amount, depending on whether the basic nanoparticle distribution is top-heavy or bottom-heavy. Oscillatory instability is possible in the case of a bottom-heavy nanoparticle distribution, depending on the phase angle and frequency of modulation.

Keywords: Brownian motion and thermophoresis, Porous medium, Nanofluid, Natural convection, Thermal modulation.

4732 Implementation of Adder-Subtracter Design with VerilogHDL

Authors: May Phyo Thwal, Khin Htay Kyi, Kyaw Swar Soe

Abstract:

With increasing chip density, designers are trying to put as many computational and storage facilities as possible on a single chip. Along with the complexity of computational and storage circuits, designing, testing and debugging become more and more complex and expensive. So, hardware designs are built using a very high speed hardware description language, which is more efficient and cost-effective. This paper focuses on the implementation of a 32-bit ALU design based on the Verilog hardware description language. The adder and subtracter operate correctly on both unsigned and positive numbers. In an ALU, addition takes most of the time if it uses a ripple-carry adder. The general strategy for designing fast adders is to reduce the time required to form carry signals. Adders that use this principle are called carry look-ahead adders. The carry look-ahead adder is designed as a combination of 4-bit adders. The syntax of Verilog HDL is similar to the C programming language. This paper proposes a unified approach to ALU design in which both simulation and formal verification can co-exist.
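
The paper's design is written in Verilog HDL; the short Python sketch below only illustrates the carry look-ahead principle (generate/propagate signals) for one 4-bit block.

```python
# Sketch of the carry look-ahead principle for one 4-bit block (the paper's
# design is in Verilog HDL; this Python version only illustrates the logic).
def cla_4bit(a, b, c_in=0):
    """Add two 4-bit values using generate/propagate carry look-ahead."""
    g = [(a >> i & 1) & (b >> i & 1) for i in range(4)]   # generate:  gi = ai & bi
    p = [(a >> i & 1) ^ (b >> i & 1) for i in range(4)]   # propagate: pi = ai ^ bi
    c = [c_in]
    for i in range(4):
        # c(i+1) = gi | (pi & ci) -- every carry depends only on g, p and c_in,
        # so all four can be formed in parallel instead of rippling.
        c.append(g[i] | (p[i] & c[i]))
    total = sum((p[i] ^ c[i]) << i for i in range(4))      # sum bit: si = pi ^ ci
    return total, c[4]                                     # 4-bit sum and carry-out

assert cla_4bit(0b1011, 0b0110) == ((0b1011 + 0b0110) & 0xF, 1)
print(cla_4bit(9, 5))   # (14, 0)
```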

Keywords: Addition, arithmetic logic unit, carry look-ahead adder, Verilog HDL.

4731 A Ground Observation Based Climatology of Winter Fog: Study over the Indo-Gangetic Plains, India

Authors: Sanjay Kumar Srivastava, Anu Rani Sharma, Kamna Sachdeva

Abstract:

Every year, fog formation over the Indo-Gangetic Plains (IGPs) of the Indian region during the winter months of December and January is believed to create numerous hazards, inconvenience, and economic loss to the inhabitants of this densely populated region of the Indian subcontinent. The aim of the paper is to analyze the spatial and temporal variability of winter fog over the IGPs. Long-term ground observations of visibility and other meteorological parameters (1971-2010) have been analyzed to understand the fog phenomenon and its relevance during the peak winter months of January and December over the IGP of India. In order to examine the temporal variability, time series and trend analysis were carried out using the Mann-Kendall statistical test. Trend analysis performed using the Mann-Kendall test accepts the alternative hypothesis at the 95% confidence level, indicating that a trend exists. Kendall's tau statistic showed that there exists a positive correlation between time and fog frequency. Further, the Theil-Sen median slope estimate showed that the magnitude of the trend is positive. The magnitude is higher during January than December for the entire IGP, except over the western IGP, where it is higher in December. Decade-wise time series analysis revealed that there has been a continuous increase in fog days. A net overall increase of 99% was observed over the IGP in the last four decades. Diurnal variability and average daily persistence were computed using descriptive statistical techniques. Geo-statistical analysis of fog was carried out to understand its spatial variability; it revealed that the IGP is a high fog-prone zone, with a fog occurrence frequency of more than 66% of days during the study period. Diurnal variability indicates that the peak occurrence of fog is between 06:00 and 10:00 local time, and average daily fog persistence extends to 5 to 7 hours during the peak winter season. The results would offer a new perspective for taking proactive measures to reduce the irreparable damage that could be caused by the changing trends of fog.
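
A minimal Python sketch of the trend tests named above, applied to a hypothetical series of annual fog-day counts (not the study's observations):

```python
# Minimal sketch of the trend tests named above, applied to a hypothetical
# series of annual fog-day counts (not the study's observations).
import numpy as np
from scipy import stats

years = np.arange(1971, 2011)
fog_days = 20 + 0.8 * (years - years[0]) + np.random.default_rng(0).normal(0, 4, years.size)

tau, p_value = stats.kendalltau(years, fog_days)                # Mann-Kendall style rank correlation
slope, intercept, lo, hi = stats.theilslopes(fog_days, years)   # Theil-Sen median slope

trend = "increasing" if slope > 0 and p_value < 0.05 else "no significant"
print(f"Kendall tau = {tau:.2f}, p = {p_value:.4f}")
print(f"Theil-Sen slope = {slope:.2f} fog days/year ({trend} trend at the 95% level)")
```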

Keywords: Fog, climatology, Mann-Kendall test, trend analysis, spatial variability, temporal variability, visibility.

4730 Performance Analysis of Cluster Based Dual Tiered Network Model with INTK Security Scheme in a Wireless Sensor Network

Authors: D. Satish Kumar, S. Karthik

Abstract:

A dual-tiered network model is designed to overcome the problems of energy alerts and fault tolerance. This model minimizes the delay time and overcomes link failures. A performance analysis of the dual-tiered network model is presented in this paper, where the CA and LS schemes are compared with DEO optimal. We then evaluate the Integrated Network Topological Control and Key Management (INTK) scheme, which was proposed to add security features to wireless sensor networks. Clustering efficiency, level of protection and time complexity are some of the parameters of the INTK scheme that were analyzed. We then evaluate the Cluster-based Energy Competent n-coverage scheme (CEC n-coverage scheme) to ensure area coverage for wireless sensor networks.

Keywords: CEC n-coverage scheme, Clustering efficiency, Dual tiered network, Wireless sensor networks.

4729 Transient Free Laminar Convection in the Vicinity of a Thermal Conductive Vertical Plate

Authors: Anna Bykalyuk, Frédéric Kuznik, Kévyn Johannes

Abstract:

In this paper the influence of a vertical plate's thermal capacity is numerically investigated in order to evaluate the evolution of the thermal boundary layer structure, as well as the convective heat transfer coefficient and the velocity and temperature profiles, while the heat flux of the heated vertical plate is evaluated under time-dependent boundary conditions. The most important feature of this problem is the unsteadiness of the physical phenomena. A 2D CFD model is developed in the Ansys Fluent 14.0 environment and is validated using unsteady data obtained for plasterboard studied under a dynamic temperature evolution. All the phenomena produced in the vicinity of the thermal conductive vertical plate (plasterboard) are analyzed and discussed. This work is the first stage of a holistic research effort on transient free convection that aims, in the future, to study natural convection in the vicinity of a vertical plate containing Phase Change Materials (PCM).

Keywords: CFD modeling, natural convection, thermal conductive plate, time-dependent boundary conditions.

4728 Inferences on Compound Rayleigh Parameters with Progressively Type-II Censored Samples

Authors: Abdullah Y. Al-Hossain

Abstract:

This paper considers inference under progressive type-II censoring with a compound Rayleigh failure time distribution. The maximum likelihood (ML) and Bayes methods are used for estimating the unknown parameters as well as some lifetime parameters, namely the reliability and hazard functions. We obtained Bayes estimators using the conjugate priors for the shape and scale parameters. When the two parameters are unknown, closed-form expressions of the Bayes estimators cannot be obtained. We use Lindley's approximation to compute the Bayes estimates. Another Bayes estimator has been obtained based on a continuous-discrete joint prior for the unknown parameters. An example with real data is discussed to illustrate the proposed method. Finally, we made comparisons between these estimators and the maximum likelihood estimators using a Monte Carlo simulation study.
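
A minimal Python sketch of maximum-likelihood estimation for a compound Rayleigh sample, assuming the commonly used density f(t) = 2αβ^α t (β + t²)^-(α+1) and a complete (uncensored) sample; the paper's progressive type-II censoring and Bayes/Lindley estimation are not reproduced here.

```python
# Minimal sketch: maximum-likelihood estimation for the compound Rayleigh
# distribution, assuming the commonly used form f(t) = 2*a*b^a * t * (b + t^2)^-(a+1)
# and a complete (uncensored) sample; the paper itself treats progressive
# type-II censoring and Bayes/Lindley estimation, which are not reproduced here.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
a_true, b_true, n = 2.0, 1.5, 200
u = rng.uniform(size=n)
t = np.sqrt(b_true * (u ** (-1.0 / a_true) - 1.0))   # inverse-CDF sampling

def neg_log_lik(params):
    a, b = params
    if a <= 0 or b <= 0:
        return np.inf
    return -(n * np.log(2 * a) + n * a * np.log(b)
             + np.sum(np.log(t)) - (a + 1) * np.sum(np.log(b + t ** 2)))

res = minimize(neg_log_lik, x0=[1.0, 1.0], method="Nelder-Mead")
print("ML estimates (alpha, beta):", res.x)
```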

Keywords: Progressive type II censoring, compound Rayleigh failure time distribution, maximum likelihood estimation, Bayes estimation, Lindley's approximation method, Monte Carlo simulation.

4727 Self-Organizing Map Network for Wheeled Robot Movement Optimization

Authors: Boguslaw Schreyer

Abstract:

The paper investigates the application of the Kohonen Self-Organizing Map (SOM) to the starting and braking dynamic states of a wheeled robot. In securing wheeled robot stability as well as minimum starting and braking time, it is important to ensure correct torque distribution as well as proper slopes of the braking and driving moments. In this paper, a correct movement distribution has been formulated, securing an optimum adhesion coefficient and good transversal stability of the wheeled robot. A neural tuner has been proposed to secure the above properties, although most of the attention is devoted to the SOM network application. If the delay of the torque application or torque release is not negligible, it is important to change the rising and falling slopes of the torque. The road/surface condition is also paramount in the control of the robot's dynamic states. As the road conditions may randomly change in time, application of the SOM network has been suggested in order to classify the actual road conditions.
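
A minimal numpy sketch of a 1-D Kohonen SOM trained on hypothetical road-condition feature vectors, to illustrate how such a network can classify surface conditions; the features and data are assumptions, not the paper's.

```python
# Minimal sketch of a 1-D Kohonen SOM that clusters hypothetical road-condition
# feature vectors (e.g. normalised slip and vibration measures). The data and
# feature choice are illustrative, not the paper's.
import numpy as np

rng = np.random.default_rng(3)
# Synthetic samples: [wheel slip ratio, vibration level], three surface types.
data = np.vstack([rng.normal([0.05, 0.2], 0.02, (50, 2)),   # dry asphalt
                  rng.normal([0.20, 0.5], 0.03, (50, 2)),   # wet asphalt
                  rng.normal([0.45, 0.8], 0.05, (50, 2))])  # loose gravel

n_nodes, n_epochs = 6, 100
weights = rng.uniform(0, 1, (n_nodes, 2))

for epoch in range(n_epochs):
    lr = 0.5 * (1 - epoch / n_epochs)                        # decaying learning rate
    radius = max(1.0, n_nodes / 2 * (1 - epoch / n_epochs))  # shrinking neighbourhood
    for x in rng.permutation(data):
        bmu = np.argmin(np.linalg.norm(weights - x, axis=1))   # best matching unit
        dist = np.abs(np.arange(n_nodes) - bmu)                # 1-D grid distance
        h = np.exp(-(dist ** 2) / (2 * radius ** 2))           # neighbourhood function
        weights += lr * h[:, None] * (x - weights)

print("Trained SOM prototypes (slip, vibration):")
print(np.round(weights, 2))
```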

Keywords: SOM network, torque distribution, torque slope, wheeled robots.

4726 The Recreation Technique Model from the Perspective of Environmental Quality Elements

Authors: G. Gradinaru, S. Olteanu

Abstract:

Quality improvements of the environmental elements could increase the recreational opportunities in a certain area (destination). The need-for-recreation technique focuses on the choice of certain destinations for recreational purposes. The basic exchange taken into consideration is the one between the satisfaction gained after staying in that area and the value expressed in the money and time allocated. The number of tourists in the respective area, the duration of their stay and the money spent, including transportation, provide information on how individuals rank the place or certain aspects of the area (such as the quality of the environmental elements). For the statistical analysis of the environmental benefits offered by an area through the need-for-recreation technique, the following stages are suggested: characterization of the reference area based on the statistical variables considered; and estimation of the environmental benefit by comparing the reference area with other similar areas (having the same environmental characteristics), from the perspective of the statistical variables considered. The comparison model in the recreation technique faces a series of difficulties, which refer to the choice of the reference area and the correct transformation of time into money.

Keywords: Comparison in recreation technique, the quality of the environmental elements, statistical analysis model.

4725 Benchmarking of Pentesting Tools

Authors: Esteban Alejandro Armas Vega, Ana Lucila Sandoval Orozco, Luis Javier García Villalba

Abstract:

The benchmarking of tools for dynamic analysis of vulnerabilities in web applications is something that is done periodically, because these tools update their knowledge bases and search algorithms from time to time in order to improve their accuracy. Unfortunately, the vast majority of these evaluations are made by software enthusiasts who publish their results on blogs or on non-academic websites, and always with the same evaluation methodology. Similarly, most academics who have carried out this type of analysis from a scientific approach perform their analysis with the same methodology as the empirical authors. This paper is based on the interest in finding answers to questions that many users of this type of tool have been asking over the years, such as whether a tool truly tests and evaluates every vulnerability it claims to, or whether it really delivers an accurate report of all the vulnerabilities tested and exploited. These kinds of questions have also motivated previous work, but without real answers. The aim of this paper is to show results that truly answer, at least for the tested tools, all those unanswered questions. All the results have been obtained by changing the common benchmarking model used in all those previous works.

Keywords: Cybersecurity, IDS, security, web scanners, web vulnerabilities.
