Search results for: nonlinear time history analysis
13107 Parametric Analysis in the Electronic Sensor Frequency Adjustment Process
Authors: Rungchat Chompu-Inwai, Akararit Charoenkasemsuk
Abstract:
The use of electronic sensors in the electronics industry has become increasingly popular over the past few years, and the sensor has become a highly competitive product. The frequency adjustment process is regarded as one of the most important processes in electronic sensor manufacturing. Inaccuracies in the frequency adjustment process can cause up to 80% waste through rework; therefore, this study aims to provide a preliminary understanding of the role of the parameters used in the frequency adjustment process and to make suggestions for further improving performance. Four parameters are considered in this study: air pressure, dispensing time, vacuum force, and the distance between the needle tip and the product. A 2^k full factorial design of experiments was used to determine the parameters that significantly affect the accuracy of the frequency adjustment process, where the deviation between the frequency after adjustment and the target frequency is expected to be 0 kHz. The experiment was conducted at two levels, using two replications and with five added center points; in total, 37 experiments were carried out. The results reveal that air pressure and dispensing time significantly affect the frequency adjustment process. The mathematical relationship between these two parameters was formulated, and the optimal air pressure and dispensing time were found to be 0.45 MPa and 458 ms, respectively. The optimal parameters were examined in a confirmation experiment, in which an average deviation of 0.082 kHz was achieved.
Keywords: Design of Experiment, Electronic Sensor, Frequency Adjustment, Parametric Analysis
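As a rough illustration of how a 2^k factorial design with center points can be fitted (not taken from the paper; the coded levels and response values below are hypothetical placeholders), a minimal Python sketch for two factors with an interaction term might look like this:

```python
# Minimal sketch, assuming coded -1/+1 factor levels plus center points;
# the response values (deviations in kHz) are made-up placeholders.
import numpy as np

x1 = np.array([-1, +1, -1, +1, 0, 0, 0, 0, 0], dtype=float)  # e.g. air pressure
x2 = np.array([-1, -1, +1, +1, 0, 0, 0, 0, 0], dtype=float)  # e.g. dispensing time
y  = np.array([1.8, 0.6, 1.1, 0.2, 0.9, 1.0, 0.8, 0.9, 1.1])  # hypothetical responses

# Design matrix with intercept, main effects and the two-factor interaction.
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["b0", "b1", "b2", "b12"], coef.round(3))))
```

The fitted coefficients indicate which main effects and interactions dominate, which is the kind of information used to pick the significant factors before locating the optimum.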
13106 The Application of Queuing Theory in Multi-Stage Production Lines
Authors: Hani Shafeek, Muhammed Marsudi
Abstract:
The purpose of this work is to examine a multi-product, multi-stage battery production line and to improve the performance of the assembly line by determining the efficiency of each workstation. Data were collected from every workstation: the throughput rate, the number of operators, and the number of parts that arrive at and leave each station during processing. At least ten samples of the arriving and leaving part counts were collected so that the data could be analysed with a Chi-Squared goodness-of-fit test and queuing theory. The measures obtained from this model were compared with the standard data available in the company, and the task-time values were validated against the task times in the company database. Several performance factors for the multi-product, multi-stage battery production line are presented, together with the efficiency of each workstation. The total production time for each part can be determined by adding the total task times across workstations. Based on the analysis, improvements should be made to reduce queuing time and increase efficiency; one possible action is to increase the number of operators at manually operated workstations.
Keywords: Production line, manufacturing, performance measurement, queuing theory.
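The abstract does not state which queuing model was applied; as an illustrative sketch only, assuming each workstation behaves as an M/M/1 queue and using hypothetical arrival and service rates, the station-level performance measures could be computed as follows:

```python
# Illustrative sketch: one workstation treated as an M/M/1 queue.
# Arrival and service rates below are assumptions, not the paper's data.
def mm1_metrics(arrival_rate, service_rate):
    """Return utilization, mean queue length and mean waiting time for M/M/1."""
    rho = arrival_rate / service_rate          # workstation utilization
    if rho >= 1.0:
        raise ValueError("Unstable station: arrival rate >= service rate")
    lq = rho ** 2 / (1.0 - rho)                # mean number of parts waiting
    wq = lq / arrival_rate                     # mean waiting time in queue
    return rho, lq, wq

rho, lq, wq = mm1_metrics(arrival_rate=8.0, service_rate=10.0)  # parts per hour
print(f"utilization={rho:.2f}, Lq={lq:.2f} parts, Wq={wq*60:.1f} min")
```

Summing the waiting and processing times across stations gives the kind of total production time per part discussed in the abstract.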
13105 Dynamics Analyses of Swing Structure Subject to Rotational Forces
Authors: Buntheng Chhorn, WooYoung Jung
Abstract:
Large-scale swings have been used in entertainment and performance, especially in the circus, for a very long time. To increase the safety of this type of structure, a thorough analysis of displacement and bearing stress was performed for the extreme condition in which a full-cycle swing occurs. Different masses, ranging from 40 kg to 220 kg, and different velocities were applied to the swing. Then, based on the solution of the dynamic differential equation, the swing velocity response to the harmonic force was obtained. Moreover, the resistance capacity was estimated based on the ACI steel structure design guide. Subsequently, numerical analysis was performed in ABAQUS to obtain the stress in each frame of the swing. Finally, the analysis shows that expansion of the swing frame section is required for masses greater than 150 kg.
Keywords: Swing structure, displacement, bearing stress, dynamic loads response, finite element analysis.
13104 A New Implementation of PCA for Fast Face Detection
Authors: Hazem M. El-Bakry
Abstract:
Principal Component Analysis (PCA) has many important applications, especially in pattern detection tasks such as face detection and recognition. For real-time applications, the response time is therefore required to be as small as possible. In this paper, a new implementation of PCA for fast face detection is presented. The new implementation is based on cross-correlation in the frequency domain between the input image and the eigenvectors (weights). Simulation results show that the proposed implementation of PCA is faster than the conventional one.
Keywords: Fast Face Detection, PCA, Cross Correlation, Frequency Domain
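To illustrate the core idea of projecting every image window onto an eigenvector at once via frequency-domain cross-correlation (a sketch of the general technique, not the authors' code; the image, eigenvector and window size are assumptions), one could write:

```python
# Minimal sketch: FFT-based cross-correlation of an image with one PCA
# eigenvector ("eigenface"), giving a projection score per window position.
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(0)
image = rng.random((240, 320))          # hypothetical grayscale input image
eigenface = rng.random((20, 20))        # one eigenvector reshaped to a 20x20 patch
eigenface -= eigenface.mean()           # remove mean, as in standard PCA projection

# Cross-correlation equals convolution with the flipped kernel; the FFT makes
# it much faster than sliding a window and projecting patch by patch.
scores = fftconvolve(image, eigenface[::-1, ::-1], mode="valid")
print(scores.shape)                     # one score per candidate window position
```

This is the standard trick that turns per-window dot products into a single fast correlation, which is consistent with the speed-up the abstract reports.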
13103 A Retrospective Cohort Study on an Outbreak of Gastroenteritis Linked to a Buffet Lunch Served during a Conference in Accra
Authors: Benjamin Osei Tutu, Sharon Annison
Abstract:
On 21st November, 2016, an outbreak of foodborne illness occurred after a buffet lunch served during a stakeholders’ consultation meeting held in Accra. An investigation was conducted to characterise the affected people, determine the etiologic food, the source of contamination and the etiologic agent, and to implement appropriate public health measures to prevent future occurrences. A retrospective cohort study was conducted via telephone interviews, using a structured questionnaire developed from the buffet menu. A case was defined as any person suffering from symptoms of foodborne illness, e.g. diarrhoea and/or abdominal cramps, after eating food served during the stakeholder consultation meeting in Accra on 21st November, 2016. The exposure status of all the members of the cohort was assessed by taking the food history of each respondent during the telephone interview. The data obtained were analysed using Epi Info 7. An environmental risk assessment was conducted to ascertain the source of the food contamination. Risks of foodborne infection from the foods eaten were determined using attack rates and odds ratios. Data were obtained from 54 people who consumed food served during the stakeholders’ meeting. Of this population, 44 people reported symptoms of food poisoning, representing an overall attack rate of 81.45%. The peak incubation period was seven hours, with minimum and maximum incubation periods of four and 17 hours, respectively. The commonly reported symptoms were diarrhoea (97.73%, 43/44), vomiting (84.09%, 37/44) and abdominal cramps (75.00%, 33/44). From the incubation period, duration of illness and the symptoms, toxin-mediated food poisoning was suspected. The environmental risk assessment of the implicated catering facility indicated a lack of time/temperature control, inadequate knowledge of food safety among workers, and sanitation issues. Only a limited number of food samples was received for microbiological analysis. Multivariate analysis indicated that illness was significantly associated with the consumption of the snacks served (OR 14.78, P < 0.001). No stool, blood, or etiologic food samples were available for organism isolation; however, the suspected etiologic agent was Staphylococcus aureus or Clostridium perfringens. The outbreak was probably due to the consumption of an unwholesome snack (tuna sandwich or chicken). The contamination and/or growth of the etiologic agent in the snack may be due to the breakdown in cleanliness, time/temperature control and good food handling practices. Training of food handlers in basic food hygiene and safety is recommended.
Keywords: Accra, buffet, C. perfringens, cohort study, food poisoning, gastroenteritis, office workers, Staphylococcus aureus.
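For readers unfamiliar with the measures used in such cohort studies, here is a minimal Python sketch of the attack-rate and odds-ratio calculations; the 2x2 counts below are hypothetical and are not the study's data:

```python
# Illustrative sketch: attack rates and an odds ratio for one food item
# from a 2x2 exposure table (all counts are hypothetical).
def attack_rate(cases, total):
    return cases / total

def odds_ratio(exp_cases, exp_noncases, unexp_cases, unexp_noncases):
    # (cases/non-cases among exposed) divided by (cases/non-cases among unexposed)
    return (exp_cases * unexp_noncases) / (exp_noncases * unexp_cases)

ar_exposed = attack_rate(cases=25, total=30)     # ate the suspect food
ar_unexposed = attack_rate(cases=10, total=24)   # did not eat it
or_food = odds_ratio(25, 5, 10, 14)
print(f"AR exposed={ar_exposed:.1%}, AR unexposed={ar_unexposed:.1%}, OR={or_food:.1f}")
```

An odds ratio well above 1 for a particular item, as reported for the snacks here, points to that item as the likely vehicle of infection.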
13102 Comparative Study of the Effects of Process Parameters on the Yield of Oil from Melon Seed (Colocynthis citrullus) and Coconut Fruit (Cocos nucifera)
Authors: Ndidi F. Amulu, Patrick E. Amulu, Gordian O. Mbah, Callistus N. Ude
Abstract:
Comparative analysis of the properties of melon seed, coconut fruit, and their oil yields was carried out in this work using standard AOAC analytical techniques. The results of the analysis revealed that the moisture contents of the samples studied were 11.15% (melon) and 7.59% (coconut), and the crude lipid contents were 46.10% (melon) and 55.15% (coconut). The treatment combinations used (leaching time, leaching temperature, and solute:solvent ratio) showed a significant difference (p < 0.05) in yield between the samples, with melon seed flour giving a higher percentage range of oil yield (41.30 – 52.90%) than coconut (36.25 – 49.83%). The physical characterization of the extracted oils was also carried out: the refractive indices were 1.487 (melon seed oil) and 1.361 (coconut oil), and the viscosities were 0.008 (melon seed oil) and 0.002 (coconut oil). The chemical analysis of the extracted oils showed acid values of 1.00 mg NaOH/g oil (melon oil) and 10.050 mg NaOH/g oil (coconut oil), and saponification values of 187.00 mg KOH/g (melon oil) and 183.26 mg KOH/g (coconut oil). The iodine value of the melon oil was 75.00 mg I2/g and that of the coconut oil 81.00 mg I2/g. The standard statistical package Minitab version 16.0 was used for the regression analysis and analysis of variance (ANOVA), and the same software was used to optimize the leaching process. Both samples gave their highest oil yields at the same optimal conditions: oil yields of ≥ 52% (melon seed) and ≥ 48% (coconut) were obtained at a solute:solvent ratio of 40 g/ml, a leaching time of 2 hours, and a leaching temperature of 50 °C. Both samples studied have the potential to yield oil, with melon seed giving the higher yield.
Keywords: Coconut, melon, optimization, processing.
13101 Stability and Hopf Bifurcation Analysis in a Stage-Structured Predator-Prey System with Two Time Delays
Authors: Yongkun Li, Meng Hu
Abstract:
A stage-structured predator-prey system with two time delays is considered. By analyzing the corresponding characteristic equation, the local stability of a positive equilibrium is investigated and the existence of Hopf bifurcations is established. Formulae are derived to determine the direction of bifurcations and the stability of bifurcating periodic solutions by using the normal form theory and center manifold theorem. Numerical simulations are carried out to illustrate the theoretical results. Based on the global Hopf bifurcation theorem for general functional differential equations, the global existence of periodic solutions is established.
Keywords: Predator-prey system, stage structure, time delay, Hopf bifurcation, periodic solution, stability.
13100 A Quantitative Analysis of GSM Air Interface Based on Radiating Columns and Prediction Model
Authors: K. M. Doraiswamy, Lakshminarayana Merugu, B. C. Jinaga
Abstract:
This paper explains the cause of nonlinearity in floor attenuation, hitherto left unexplained. The performance degradation occurring on the air interface for GSM signals is quantitatively analysed using the concept of the radiating columns of buildings. The signal levels were measured using a wireless network optimising drive test tool (E6474A of Agilent Technologies). The measurements were taken in a reflected-signal environment, under usual fading conditions, on actual GSM signals radiated from base stations. A mathematical model was derived from the measurements to predict the GSM signal levels on different floors. It was applied to three buildings, and the predicted signal levels were found to deviate from the measured levels by no more than ±2 dB on all floors. It is more accurate than prediction models based on the Floor Attenuation Factor and can be used for planning proper indoor coverage in multi-storey buildings.
Keywords: GSM air interface, nonlinear attenuation, multistory building, radiating columns, ground conduction and floor attenuation factor.
13099 Stochastic Optimization of a Vendor-Managed Inventory Problem in a Two-Echelon Supply Chain
Authors: Bita Payami-Shabestari, Dariush Eslami
Abstract:
The purpose of this paper is to develop a multi-product economic production quantity model under a vendor-managed inventory policy and restrictions including limited warehouse space, budget, number of orders, average shortage time, and maximum permissible shortage. Since the costs cannot be predicted with certainty, the data are assumed to behave under an uncertain environment. The problem is first formulated as a bi-objective multi-product economic production quantity model. It is then solved with three multi-objective decision-making (MODM) methods, and the three methods are compared in terms of the optimal values of the two objective functions and the central processing unit (CPU) time, using statistical analysis and multi-attribute decision-making (MADM). The results of the study demonstrate that the augmented-constraint method performs better than the global criteria and goal programming methods in terms of the optimal values of the two objective functions and the CPU time. Sensitivity analysis is carried out to illustrate the effect of parameter variations on the optimal solution. The contribution of this research is the use of random cost data in developing a multi-product economic production quantity model under a vendor-managed inventory policy with several constraints.
Keywords: Economic production quantity, random cost, supply chain management, vendor-managed inventory.
13098 Modeling Reaction Time in Car-Following Behaviour Based on Human Factors
Authors: Atif Mehmood, Said M. Easa
Abstract:
This paper develops driver reaction-time models for car-following analysis based on human factors. The reaction time was classified as brake-reaction time (BRT) and acceleration/deceleration reaction time (ADRT). The BRT occurs when the lead vehicle is braking and its brake light is on, while the ADRT occurs when the driver reacts to adjust his/her speed using the gas pedal only. The study evaluates the effect of driver characteristics and traffic kinematic conditions on driver reaction time in a car-following environment. The kinematic conditions introduced urgency and expectancy based on the braking behaviour of the lead vehicle at different speeds and spacings; these conditions were used for evaluating the BRT and are classified as normal, surprised, and stationary. Data were collected on a driving simulator integrated into a real car and included the BRT and ADRT (as dependent variables) and the driver's age, gender, driving experience, driving intensity (driving hours per week), vehicle speed, and spacing (as independent variables). The results showed a significant difference in the BRT between the normal, surprised, and stationary scenarios and supported the hypothesis that both urgency and expectancy have significant effects on BRT. Driver's age, gender, speed, and spacing were found to be significant variables for the BRT in all scenarios, and driver's age and gender were also significant variables for the ADRT. The research presented in this paper is part of a larger project to develop a driver-sensitive in-vehicle rear-end collision warning system.
Keywords: Brake reaction time, car-following, human factors, modeling.
13097 Methodology for the Multi-Objective Analysis of Data Sets in Freight Delivery
Authors: Dale Dzemydiene, Aurelija Burinskiene, Arunas Miliauskas, Kristina Ciziuniene
Abstract:
Data flows and the purposes of reporting data differ and depend on business needs. Different parameters are reported and transferred regularly during freight delivery. These business practices form a dataset, constructed for each time point, that contains all the information required for freight-movement decisions. As a significant amount of these data is used for various purposes, an integrated methodological approach must be developed to respond to this problem. The proposed methodology contains several steps: (1) collecting context data sets and data validation; (2) multi-objective analysis for optimizing freight transfer services. For data validation, the study involves Grubbs outlier analysis, particularly for data cleaning and for identifying the statistical significance of data-reporting event cases. The Grubbs test is often used because it tests one extreme value at a time against the boundaries of the standard normal distribution. In the study area, the test has not been widely applied, except where the Grubbs test for outlier detection was used to identify outliers in fuel consumption data; in this study, the authors applied the method with a confidence level of 99%. For the multi-objective analysis, the authors select forms of genetic algorithm construction that offer more possibilities of extracting the best solution. For freight delivery management, genetic algorithm schemes are an effective technique, so an adaptable genetic algorithm is applied to describe the process of choosing an effective transportation corridor. In this study, multi-objective genetic algorithm methods are used to optimize the data evaluation and to select the appropriate transport corridor. The authors suggest a methodology for the multi-objective analysis that evaluates the collected context data sets and uses this evaluation to determine a delivery corridor for freight transfer services in the multi-modal transportation network. The multi-objective analysis includes safety components, the number of accidents per year, and freight delivery time in the multi-modal transportation network. The proposed methodology has practical value in the management of multi-modal transportation processes.
Keywords: Multi-objective decision support, analysis, data validation, freight delivery, multi-modal transportation, genetic programming methods.
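The data-validation step relies on the Grubbs outlier test at a 99% confidence level; as a minimal illustrative sketch (the sample values below are hypothetical fuel-consumption figures, not the study's data), the two-sided test can be written as:

```python
# Illustrative sketch of a two-sided Grubbs outlier test at alpha = 0.01.
import numpy as np
from scipy import stats

def grubbs_test(x, alpha=0.01):
    x = np.asarray(x, dtype=float)
    n = x.size
    g = np.max(np.abs(x - x.mean())) / x.std(ddof=1)   # Grubbs statistic
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)         # t critical value
    g_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
    return g, g_crit, g > g_crit

g, g_crit, is_outlier = grubbs_test([30.1, 29.8, 30.4, 29.9, 30.2, 41.7])
print(f"G={g:.2f}, Gcrit={g_crit:.2f}, outlier detected: {is_outlier}")
```

Each suspected value is tested one at a time, which matches the abstract's description of the test checking a single extreme value against the normal-distribution bounds.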
13096 Shannon-Weaver Biodiversity of Neutrophils in Fractal Networks of Immunofluorescence for Medical Diagnostics
Authors: N. E. Galich
Abstract:
We develop new nonlinear methods of immunofluorescence analysis for a sensitive technology of the respiratory burst reaction of DNA fluorescence due to oxidative activity in peripheral blood neutrophils. Histograms in flow cytometry experiments represent the frequency of fluorescence flashes as a function of fluorescence intensity. We use the Shannon-Weaver index to define the neutrophils' biodiversity and the Hurst index to define fractal correlations in immunofluorescence for different donors, as the basic quantitative criteria for medical diagnostics of health status. We analyse the frequencies of flashes, information, Shannon entropies, and their fractals in immunofluorescence networks as the histogram range is reduced. We found a number of simple universal correlations between biodiversity, information, and the Hurst index in the diagnostics and classification of pathologies for a wide spectrum of diseases. In addition, a clear criterion of common immunity and human health status is determined in the form of yes/no answers; these answers are based on peculiarities of the information in immunofluorescence networks and the biodiversity of neutrophils. Experimental data analysis has shown the existence of homeostasis for information entropy in the oxidative activity of DNA in neutrophil nuclei for all donors.
Keywords: blood and cell fluorescence in diagnostics of diseases, cytometric histograms, entropy and information in fractal networks of oxidative activity of DNA, long-range chromosomal correlations in living cells.
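As an illustration of the Shannon-Weaver index used as the diversity criterion (a generic sketch, not the authors' code; the histogram counts below are hypothetical), the index can be computed directly from a fluorescence histogram:

```python
# Minimal sketch: Shannon-Weaver index of a fluorescence-intensity histogram,
# treating each bin's relative frequency as p_i. Counts are hypothetical.
import numpy as np

def shannon_weaver(counts):
    counts = np.asarray(counts, dtype=float)
    p = counts / counts.sum()
    p = p[p > 0]                       # skip empty bins
    return -np.sum(p * np.log(p))      # entropy in nats

histogram = [120, 340, 560, 410, 220, 90, 30]   # flash counts per intensity bin
print(f"H = {shannon_weaver(histogram):.3f}")
```

Recomputing the index as the histogram range is reduced, as the abstract describes, traces how the diversity measure depends on the retained intensity window.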
13095 Effect of Confinement on the Bearing Capacity and Settlement of Spread Foundations
Authors: Tahsin Toma Sabbagh, Ihsan Al-Abboodi, Ali Al-Jazaairry
Abstract:
Allowable bearing capacity is the capacity of soil to safely carry the pressure from the superstructure without experiencing shear failure accompanied by excessive settlement. However, ensuring a bearing pressure that is safe with respect to failure does not guarantee that settlement of the foundation will be within acceptable limits; therefore, settlement analysis should always be performed, since most structures are settlement sensitive. When visualising the movement of the soil wedge in the bearing capacity criterion, both vertically and horizontally, it becomes clear that by confining the soil surrounding the foundation, both the bearing capacity and the settlement improve. In this study, two sizes of spread foundation were considered, (2×4) m and (3×5) m, representing two real case studies of an existing building. The foundations were analysed in terms of dimension as well as position with respect to a confining wall (i.e., sheet piles on both sides). Taking B as the least foundation dimension, the study comprised analyses of three distances between the sheet piles and the foundations, (0.1 B), (0.5 B), and (0.75 B), alongside three depths of confinement, (0.5 B), (1 B), and (1.5 B). Nonlinear three-dimensional finite element analysis (ANSYS) was adopted to investigate the behaviour of the two foundations in the case study. Results showed that confinement of the foundations reduced the overall stresses near the foundation by 65% and reduced the vertical displacement by 90%. Moreover, the most effective distance between the confinement wall and the foundation was found to be 0.5 B.
Keywords: Bearing capacity, cohesionless soils, spread footings, soil confinement, soil modelling.
13094 Time Temperature Dependence of Long Fiber Reinforced Polypropylene Manufactured by Direct Long Fiber Thermoplastic Process
Authors: K. A. Weidenmann, M. Grigo, B. Brylka, P. Elsner, T. Böhlke
Abstract:
In order to reduce fuel consumption, the weight of automobiles has to be reduced. Fiber reinforced polymers offer the potential to reach this aim because of their high stiffness to weight ratio. Additionally, the use of fiber reinforced polymers in automotive applications has to allow for economic large-scale production. In this regard, long fiber reinforced thermoplastics made by direct processing offer both mechanical performance and processability in injection moulding and compression moulding. The work presented in this contribution deals with long glass fiber reinforced polypropylene directly processed in compression moulding (D-LFT). For use in automotive applications, both the temperature and the time dependency of the material properties have to be investigated to fulfil performance requirements during a crash and the demands of service temperatures ranging from -40 °C to 80 °C. To consider the influence of both temperature and time, quasistatic tensile tests have been carried out at different temperatures. These tests have been complemented by high-speed tensile tests at different strain rates. As expected, the increase in strain rate results in an increase of the elastic modulus, which correlates to an increase of the stiffness with decreasing service temperature. The results are in good accordance with results determined by dynamic mechanical analysis within the range of 0.1 to 100 Hz. The experimental results from the different testing methods were grouped and interpreted using different time-temperature shift approaches. In this regard, the Williams-Landel-Ferry approach and the kinetics-based Arrhenius approach have been used. As the theoretical shift factor follows an arctan function, an empirical approach was also taken into consideration. It could be shown that this empirical approach best describes the time-temperature superposition for glass fiber reinforced polypropylene manufactured by D-LFT processing.
Keywords: Composite, long fiber reinforced thermoplastics, mechanical properties, dynamic mechanical analysis, time temperature superposition.
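For reference, the two classical shift-factor forms named in the abstract can be sketched as below; the WLF constants, activation energy, and reference temperature are placeholders, not the values fitted in the paper:

```python
# Illustrative sketch of WLF and Arrhenius time-temperature shift factors.
import numpy as np

R = 8.314  # J/(mol K)

def wlf_log_shift(T, T_ref, C1=17.44, C2=51.6):
    """Williams-Landel-Ferry: log10(a_T) relative to T_ref (placeholder constants)."""
    return -C1 * (T - T_ref) / (C2 + (T - T_ref))

def arrhenius_log_shift(T, T_ref, Ea=80e3):
    """Arrhenius: log10(a_T) from an assumed activation energy Ea in J/mol."""
    return (Ea / (np.log(10) * R)) * (1.0 / T - 1.0 / T_ref)

T_ref = 296.15                              # 23 °C in kelvin
for T in (233.15, 296.15, 353.15):          # -40 °C, 23 °C, 80 °C
    print(f"{T:.2f} K: WLF {wlf_log_shift(T, T_ref):+.2f}, "
          f"Arrhenius {arrhenius_log_shift(T, T_ref):+.2f}")
```

Whichever form is adopted, the shift factor is what allows data measured at different temperatures and strain rates to be collapsed onto a single master curve.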
13093 Analyzing and Formulation of Product Lead Time
Authors: B. Fahimnia, L.H.S. Luong, B. Motevallian, R. M. Marian, M. M. Esmaeil
Abstract:
Product Lead Time (PLT) is the period of time from receiving a customer's order to delivering the final product. PLT is an indicator of manufacturing controllability, efficiency, and performance. Due to the explosion in the rate of technological innovation and the rapid changes in the nature of manufacturing processes, manufacturing firms can bring new products to market quicker only if they can reduce their PLT and speed up the rate at which they can design, plan, control, and manufacture. Although there is a substantial body of manufacturing research relating to cost and quality issues, little specific research has been conducted on the formulation of PLT, despite its significance and importance. This paper analyzes and formulates PLT, which can be used as a guideline for achieving a shorter PLT. Furthermore, this paper identifies the causes of delay and the factors that contribute to increased product lead time.
Keywords: Manufacturing Control, Manufacturing Lead Time, Manufacturing Planning, Product Design, and Product Lead Time.
13092 End-to-End Pyramid Based Method for MRI Reconstruction
Authors: Omer Cahana, Maya Herman, Ofer Levi
Abstract:
Magnetic Resonance Imaging (MRI) is a lengthy medical scan owing to its long acquisition time, which stems mainly from the traditional sampling theorem defining a lower bound on sampling. However, it is still possible to accelerate the scan with a different approach such as Compressed Sensing (CS) or Parallel Imaging (PI). These two complementary methods can be combined to achieve a faster scan with high-fidelity imaging. To achieve that, two conditions must be satisfied: i) the signal must be sparse under a known transform domain, and ii) the sampling method must be incoherent. In addition, a nonlinear reconstruction algorithm must be applied to recover the signal. While the rapid advances in Deep Learning (DL) have brought tremendous success in various computer vision tasks, the field of MRI reconstruction is still in its early stages. In this paper, we present an end-to-end method for MRI reconstruction from k-space to image. Our method contains two parts. The first is sensitivity map estimation (SME), a small yet effective network that can easily be extended to a variable number of coils. The second is reconstruction, a top-down architecture with lateral connections developed to build high-level refinement at all scales. Our method achieves state-of-the-art results on the fastMRI benchmark, the largest and most diverse benchmark for MRI reconstruction.
Keywords: Accelerate MRI scans, image reconstruction, pyramid network, deep learning.
13091 Automatic Detection and Spatio-temporal Analysis of Commercial Accumulations Using Digital Yellow Page Data
Authors: Yuki Akiyama, Hiroaki Sengoku, Ryosuke Shibasaki
Abstract:
In this study, the locations and areas of commercial accumulations were detected using digital yellow page data. An original buffering method that can accurately create polygons of commercial accumulations is proposed in this paper; by using this method, the distribution of commercial accumulations can be easily created and monitored over a wide area. The locations, areas, and time-series changes of commercial accumulations in the South Kanto region can be monitored by integrating the polygons of commercial accumulations with the time-series digital yellow page data. The circumstances of commercial accumulations were shown to vary according to area, that is, between highly urbanized regions such as the city center of Tokyo and the prefectural capitals, suburban areas near large cities, and suburban and rural areas.
Keywords: Commercial accumulations, Spatio-temporal analysis, Urban monitoring, Yellow page data
13090 Experimental Study of CO2 Absorption in Different Blend Solutions as Solvent for CO2 Capture
Authors: Rouzbeh Ramezani, Renzo Di Felice
Abstract:
Nowadays, the removal of CO2, one of the major contributors to global warming, using alternative solvents with high CO2 absorption efficiency is an important industrial operation. In this study, three amines, 2-methylpiperazine, potassium sarcosinate, and potassium lysinate, were added as potential additives to a potassium carbonate solution used as the base solvent for CO2 capture. To study the absorption performance in terms of CO2 loading capacity and absorption rate, absorption experiments with blends of the additives and potassium carbonate were carried out in a vapor-liquid equilibrium apparatus at a temperature of 313.15 K, CO2 partial pressures ranging from 0 to 50 kPa, and mole fractions of 0.2, 0.3, and 0.4. Furthermore, the CO2 absorption performance of these blend solutions was compared with that of pure monoethanolamine and pure potassium carbonate. Finally, a correlation with good accuracy was developed using nonlinear regression analysis to predict the CO2 loading capacity.
Keywords: Absorption rate, carbon dioxide, CO2 capture, global warming, loading capacity.
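The paper does not give the form of its loading correlation; purely as a hedged illustration of nonlinear regression for this kind of data, one might fit a saturating form to loading-versus-partial-pressure points (the functional form, parameters, and data below are all assumptions):

```python
# Illustrative sketch only: nonlinear least-squares fit of a simple
# Langmuir-type loading correlation. Data points are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def loading_model(p_co2, a, b):
    """Loading rises with CO2 partial pressure and saturates at a/b."""
    return a * p_co2 / (1.0 + b * p_co2)

p = np.array([0.0, 5.0, 10.0, 20.0, 35.0, 50.0])        # kPa
alpha = np.array([0.0, 0.21, 0.34, 0.48, 0.58, 0.63])   # hypothetical mol CO2 / mol solvent
params, _ = curve_fit(loading_model, p, alpha, p0=[0.05, 0.05])
print(dict(zip(["a", "b"], params.round(4))))
```

The fitted parameters then let the loading capacity be predicted at intermediate partial pressures within the measured range.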
13089 Robust Control for Discrete-Time Sector Bounded Systems with Time-Varying Delay
Authors: Ju H. Park, S.M. Lee
Abstract:
In this paper, we propose a robust controller design method for discrete-time systems with sector-bounded nonlinearities and time-varying delay. Based on Lyapunov theory, delay-dependent stabilization criteria are obtained in terms of linear matrix inequalities (LMIs) by constructing a new Lyapunov-Krasovskii functional and using some inequalities. A robust state feedback controller is designed within the LMI framework using a reciprocally convex combination technique. The effectiveness of the proposed method is verified through a numerical example.
Keywords: Lur'e systems, Time-delay, Stabilization, LMIs.
13088 An Analysis of Genetic Algorithm Based Test Data Compression Using Modified PRL Coding
Authors: K. S. Neelukumari, K. B. Jayanthi
Abstract:
In this paper, genetic-algorithm-based test data compression is targeted at improving the compression ratio and reducing the computation time. The genetic algorithm is based on extended pattern run-length coding. The test set contains a large number of X values that can be effectively exploited to improve test data compression. In this coding method, a reference pattern is set and its compatibility is checked; a genetic algorithm is proposed to reduce the computation time of the encoding algorithm. This coding technique encodes the 2n compatible patterns or the inversely compatible patterns into a single test data segment or multiple test data segments. The experimental results show that both the compression ratio and the computation time are improved.
Keywords: Backtracking, test data compression (TDC), x-filling, x-propagating and genetic algorithm.
13087 Using Analytical Hierarchy Process and TOPSIS Approaches in Designing a Finite Element Analysis Automation Program
Authors: Ming Wen, Nasim Nezamoddini
Abstract:
Sophisticated numerical simulations like finite element analysis (FEA) involve a complicated process, from model setup to post-processing tasks, that requires replication of time-consuming steps. Utilizing an FEA automation program simplifies the complexity of the involved steps while minimizing human errors in analysis setup, calculations, and results processing. One of the main challenges in designing FEA automation programs is to identify user requirements and link them to possible design alternatives. This paper presents a decision-making framework to design a Python-based FEA automation program for modal analysis, frequency response analysis, and random vibration fatigue (RVF) analysis procedures. The analytical hierarchy process (AHP) and the technique for order preference by similarity to ideal solution (TOPSIS) are applied to evaluate design alternatives considering the feedback received from experts and program users.
Keywords: FEA, random vibration fatigue, process automation, AHP, TOPSIS, multiple-criteria decision-making, MCDM.
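To illustrate the TOPSIS step of ranking design alternatives (a generic sketch, not the paper's implementation; the decision matrix, weights, and benefit/cost directions below are hypothetical, with the weights standing in for those an AHP pass would produce):

```python
# Minimal TOPSIS sketch: rank alternatives by closeness to the ideal solution.
import numpy as np

def topsis(matrix, weights, benefit):
    X = np.asarray(matrix, dtype=float)
    W = np.asarray(weights, dtype=float)
    V = (X / np.linalg.norm(X, axis=0)) * W          # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)                   # closeness, higher is better

scores = topsis(matrix=[[7, 0.4, 9], [8, 0.6, 7], [6, 0.3, 8]],   # 3 alternatives
                weights=[0.5, 0.2, 0.3],                          # e.g. from AHP
                benefit=[True, False, True])                      # run time is a cost
print(scores.argsort()[::-1])                        # alternatives, best to worst
```

The closeness coefficient gives the final ordering of the candidate program designs once the criteria weights have been agreed with the experts and users.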
13086 Simultaneous Tuning of Static Var Compensator and Power System Stabilizer Employing Real-Coded Genetic Algorithm
Authors: S. Panda, N. P. Patidar, R. Singh
Abstract:
Power system stability enhancement by simultaneous tuning of a Power System Stabilizer (PSS) and a Static Var Compensator (SVC)-based controller is thoroughly investigated in this paper. The coordination among the proposed damping stabilizers and the SVC internal voltage regulators has also been taken into consideration. The design problem is formulated as an optimization problem with a time-domain simulation-based objective function and Real-Coded Genetic Algorithm (RCGA) is employed to search for optimal controller parameters. The proposed stabilizers are tested on a weakly connected power system with different disturbances and loading conditions. The nonlinear simulation results are presented to show the effectiveness and robustness of the proposed control schemes over a wide range of loading conditions and disturbances. Further, the proposed design approach is found to be robust and improves stability effectively even under small disturbance and unbalanced fault conditions.
Keywords: Real-Coded Genetic Algorithm (RCGA), Static Var Compensator (SVC), Power System Stabilizer (PSS), Low Frequency Oscillations, Power System Stability.
13085 ψ-exponential Stability for Non-linear Impulsive Differential Equations
Authors: Bhanu Gupta, Sanjay K. Srivastava
Abstract:
In this paper, we shall present sufficient conditions for the ψ-exponential stability of a class of nonlinear impulsive differential equations. We use the Lyapunov method with functions that are not necessarily differentiable. In the last section, we give some examples to support our theoretical results.
Keywords: Exponential stability, globally exponential stability, impulsive differential equations, Lyapunov function, ψ-stability.
13084 Analysis of Electrocardiograph (ECG) Signal for the Detection of Abnormalities Using MATLAB
Authors: Durgesh Kumar Ojha, Monica Subashini
Abstract:
The proposed method is to study and analyze the electrocardiograph (ECG) waveform to detect abnormalities with reference to the P, Q, R, and S peaks. The first phase includes the acquisition of real-time ECG data. In the next phase, signals are generated and pre-processed. Thirdly, the procured ECG signal is subjected to feature extraction; the extracted features detect abnormal peaks present in the waveform, so normal and abnormal ECG signals can be differentiated based on the features extracted. The work is implemented in the familiar multipurpose tool MATLAB, whose functions (both built-in and user-defined) efficiently implement the algorithms and techniques for detecting abnormalities in the ECG signal and allow ECG signals to be processed and analysed in real-time applications. The simulation would help in improving the accuracy, and the hardware could be built conveniently.
Keywords: ECG Waveform, Peak Detection, Arrhythmia, Matlab.
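The paper works in MATLAB; purely for illustration of the peak-detection idea, an equivalent sketch in Python/SciPy is shown below, using a synthetic signal and assumed sampling rate and thresholds rather than real ECG data:

```python
# Illustrative sketch only: R-peak detection by thresholded peak picking.
import numpy as np
from scipy.signal import find_peaks

fs = 360                                          # sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) ** 63           # crude synthetic train of sharp peaks

# R peaks: above an amplitude threshold and at least 0.4 s apart.
r_peaks, _ = find_peaks(ecg, height=0.5, distance=int(0.4 * fs))
rr = np.diff(r_peaks) / fs                        # R-R intervals in seconds
print(f"beats={len(r_peaks)}, mean HR={60 / rr.mean():.0f} bpm")
```

Once the R peaks are located, irregular R-R intervals or missing/abnormal P, Q, and S features are the cues used to flag arrhythmia-like abnormalities.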
13083 Combining the Description Features of UMLRT and CSP+T Specifications Applied to a Complete Design of Real-Time Systems
Authors: Kawtar Benghazi Akhlaki, Manuel I. Capel-Tuñón
Abstract:
UML is a collection of notations for capturing a software system specification. These notations have a specific syntax defined by the Object Management Group (OMG), but many of their constructs only present informal semantics. They are primarily graphical, with textual annotation. The inadequacy of standard UML as a vehicle for the complete specification and implementation of real-time embedded systems has led to a variety of competing and complementary proposals. The Real-Time UML profile (UML-RT), developed and standardized by OMG, defines a unified framework to express the time, scheduling, and performance aspects of a system. We present in this paper a framework approach aimed at deriving a complete specification of a real-time system. Therefore, we combine two methods: a semiformal one, UML-RT, which allows the visual modeling of a real-time system, and a formal one, CSP+T, a design language that includes the specification of real-time requirements. To show the applicability of the approach, a correct design of a real-time system with hard real-time constraints is obtained by applying a set of mapping rules.
Keywords: CSP+T, formal software specification, process algebras, real-time systems, unified modeling language.
13082 Delay-Dependent H∞ Performance Analysis for Markovian Jump Systems with Time-Varying Delays
Authors: Yucai Ding, Hong Zhu, Shouming Zhong, Yuping Zhang
Abstract:
This paper considers H∞ performance for Markovian jump systems with time-varying delays. The systems under consideration involve a disturbance signal, Markovian switching, and time-varying delays. Using a new Lyapunov-Krasovskii functional and a convex optimization approach, a delay-dependent stability condition in terms of a linear matrix inequality (LMI) is derived, which guarantees asymptotic stability in the mean square and a prescribed H∞ performance index for the considered systems. Two numerical examples are given to illustrate the effectiveness and the reduced conservatism of the proposed main results. All these results are expected to be of use in the study of stochastic systems with time-varying delays.
Keywords: H∞ performance, Markovian switching, Delay-dependent stability, Linear matrix inequality (LMI)
13081 Embedded Systems Energy Consumption Analysis Through Co-modelling and Simulation
Authors: José Antonio Esparza Isasa, Finn Overgaard Hansen, Peter Gorm Larsen
Abstract:
This paper presents a new methodology for studying power and energy consumption in mechatronic systems early in the development process. This new approach makes use of two modeling languages to represent and simulate embedded control software and electromechanical subsystems, in the discrete-event and continuous-time domains respectively, within a single co-model. This co-model enables an accurate representation of power and energy consumption and facilitates the analysis and development of both the software and the electro-mechanical subsystems in parallel. This makes engineers aware of the energy-wise implications of different design alternatives and enables early trade-off analysis from the beginning of the analysis and design activities.
Keywords: Energy consumption, embedded systems, model-driven engineering, power awareness.
13080 Parametric Analysis of Effective Factors on the Seismic Rehabilitation of the Foundations by Network Micropile
Authors: Keivan Abdollahi, Alireza Mortezaei
Abstract:
The main objective of the seismic rehabilitation of foundations is to decrease the range of horizontal and vertical vibrations and to eliminate high-frequency content under seismic loading. In this regard, the advantages of a micropile network are utilized. A reduction in the vibration range of the foundation can be achieved by using elements of high dynamic rigidity, such as deep foundations. In addition, the natural frequency of the pile-soil system increases as the system rigidity rises. Accordingly, the main strategy is to decrease the horizontal and vertical seismic vibrations of the structure; in this case, considering the interaction of the foundation, the piles, and the improved soil is a primary concern. Therefore, in this paper, the factors effective in the seismic rehabilitation of foundations using micropile networks in sandy soils with nonlinear response are studied.
Keywords: Micropile network, rehabilitation, vibration, seismic load.
13079 On the Determination of a Time-like Dual Curve in Dual Lorentzian Space
Authors: Emin Özyılmaz
Abstract:
In this work, the position vector of a time-like dual curve with respect to the standard frame of the dual Lorentzian space D^3_1 is investigated. First, it is proven that the position vector of a time-like dual curve satisfies a dual vector differential equation of fourth order. The general solution of this dual vector differential equation has not yet been found. Therefore, in terms of special solutions, the position vectors of some special time-like dual curves with respect to the standard frame of D^3_1 are presented.
Keywords: Classical Differential Geometry, Dual Numbers, Dual Frenet Equations, Time-like Dual Curve, Position Vector, Dual Lorentzian Space.
13078 Effect of Atmospheric Pressure on the Flow at the Outlet of a Propellant Nozzle
Authors: R. Haoui
Abstract:
The purpose of this work is to simulate the flow at the exit of the Vulcain 1 engine of the European launcher Ariane 5. The geometry of the propellant nozzle was determined previously using the method of characteristics. The pressure in the outlet section of the nozzle is less than the atmospheric pressure at the ground, causing oblique and normal shock waves at the exit. During the rise of the launcher, the atmospheric pressure decreases and the shock wave disappears. The code allows the capture of the shock wave at the nozzle exit. The numerical technique uses the flux vector splitting method of Van Leer to ensure convergence and avoid calculation instabilities. The Courant-Friedrichs-Lewy (CFL) coefficient and the mesh size are selected to ensure numerical convergence. The system of nonlinear partial differential equations governing this flow is solved with an explicit unsteady numerical scheme using the finite volume method. The accuracy of the solution depends on the size of the mesh and also on the time step used in the discretized equations. We have chosen in this study the mesh that gives a stationary solution with good accuracy.
Keywords: Launchers, supersonic flow, finite volume, nozzles, shock wave.
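Since the explicit scheme's time step is limited by the CFL condition mentioned in the abstract, a minimal sketch of that constraint is shown below; the mesh size, CFL number, and flow state are assumptions, not values from the paper:

```python
# Minimal sketch: CFL-limited time step for an explicit 1D finite-volume scheme.
import numpy as np

def stable_dt(u, a, dx, cfl=0.5):
    """Largest explicit time step allowed by the CFL condition (a = speed of sound)."""
    return cfl * dx / np.max(np.abs(u) + a)

gamma, R = 1.4, 287.0
T = np.full(200, 2200.0)                 # hypothetical static temperature field (K)
u = np.linspace(1000.0, 4000.0, 200)     # hypothetical axial velocity profile (m/s)
a = np.sqrt(gamma * R * T)               # local speed of sound
print(f"dt = {stable_dt(u, a, dx=1e-3):.3e} s")
```

Choosing the CFL number and mesh size together, as the abstract describes, fixes the largest time step for which the explicit unsteady scheme remains stable while converging to the stationary solution.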