Search results for: Vortex element method
4958 Perfect Plastic Deformation of a Circular Thin Bronze Plate due to the Growth and Collapse of a Vapour Bubble
Authors: M.T. Shervani-Tabar, M. Rezaee, E. Madadi Kandjani
Abstract:
Dynamics of a vapour bubble generated by a high local energy input near a circular thin bronze plate, in the absence of buoyancy forces, is numerically investigated in this paper. The bubble is generated near a thin bronze plate and, during its growth and collapse, it deforms the nearby plate. The Boundary Integral Equation Method is employed for numerical simulation of the problem. The fluid is assumed to be incompressible, irrotational and inviscid, and the surface tension on the bubble boundary is neglected; therefore the fluid flow around the vapour bubble can be treated as a potential flow. Furthermore, the thin bronze plate is assumed to have perfectly plastic behaviour. Results show that the displacement of the circular thin bronze plate has a considerable effect on the dynamics of its nearby vapour bubble. It is found that by decreasing the thickness of the thin bronze plate, the growth and collapse rate of the bubble becomes higher and consequently the lifetime of the bubble becomes shorter.
Keywords: Vapour Bubble, Thin Bronze Plate, Boundary Integral Equation Method.
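For reference, a standard boundary-integral statement of the potential-flow problem assumed above is sketched below; it is the generic Green's-identity form for a velocity potential \phi on the bubble (and plate) surface S, not the authors' specific discretization.

\[
c(p)\,\phi(p) \;=\; \int_{S} \left[\, G(p,q)\,\frac{\partial \phi}{\partial n}(q) \;-\; \phi(q)\,\frac{\partial G}{\partial n}(p,q) \,\right] \mathrm{d}S(q),
\qquad
G(p,q) \;=\; \frac{1}{4\pi\,\lvert p-q \rvert},
\]

where c(p) is the solid-angle coefficient at the collocation point p; the bubble surface is then advanced in time using the computed normal velocities \partial\phi/\partial n.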
4957 Ontology Population via NLP Techniques in Risk Management
Authors: Jawad Makki, Anne-Marie Alquier, Violaine Prince
Abstract:
In this paper we propose an NLP-based method for Ontology Population from texts and apply it to semi-automatically instantiate a Generic Knowledge Base (Generic Domain Ontology) in the risk management domain. The approach is semi-automatic and relies on domain expert intervention for validation. The proposed approach is based on a set of Instance Recognition Rules built on syntactic structures, and on the predicative power of verbs in the instantiation process. It is not domain dependent since it heavily relies on linguistic knowledge. A description of an experiment performed on a part of the ontology of the PRIMA project (supported by the European Community) is given. A first validation of the method is done by populating this ontology with Chemical Fact Sheets from the Environmental Protection Agency. The results of this experiment complete the paper and support the hypothesis that relying on the predicative power of verbs in the instantiation process improves the performance.
Keywords: Information Extraction, Instance Recognition Rules, Ontology Population, Risk Management, Semantic analysis.
4956 Independent Design of Multi-loop PI/PID Controllers for Multi-delay Processes
Authors: Truong Nguyen Luan Vu, Moonyong Lee
Abstract:
The interactions between input/output variables are a very common phenomenon encountered in the design of multi-loop controllers for interacting multivariable processes, and they can be a serious obstacle to achieving good overall performance of a multi-loop control system. To overcome this impediment, a decomposed dynamic interaction analysis is proposed by decomposing the multi-loop control system into a set of n independent SISO systems with the corresponding effective open-loop transfer functions (EOTFs), in which the dynamic interactions are embedded explicitly. For each EOTF, a reduced model is independently formulated by using the proposed reduction design strategy, and the paired multi-loop proportional-integral-derivative (PID) controller is then derived quite simply and straightforwardly by using internal model control (IMC) theory. This design method can easily be implemented for various industrial processes because of its effectiveness. Several case studies are considered to demonstrate the superiority of the proposed method.
Keywords: Multi-loop PID controller, internal model control (IMC), effective open-loop transfer function (EOTF).
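The abstract does not give the reduced-model form or the exact tuning rules, so the sketch below only illustrates the general idea: once an EOTF has been reduced to a first-order-plus-dead-time (FOPDT) model, standard IMC rules give the PI/PID settings for that loop. The FOPDT form and the textbook IMC formulas are assumptions for illustration, not the paper's derivation.

```python
# Sketch: IMC-based PI/PID tuning from a reduced FOPDT model of an EOTF.
# G(s) = K * exp(-theta*s) / (tau*s + 1); lam is the IMC filter time constant.
from dataclasses import dataclass

@dataclass
class FOPDT:
    K: float      # steady-state gain of the reduced EOTF
    tau: float    # time constant
    theta: float  # dead time

def imc_pi(model: FOPDT, lam: float):
    """PI settings from the standard IMC design."""
    Kc = model.tau / (model.K * (lam + model.theta))
    tau_I = model.tau
    return Kc, tau_I

def imc_pid(model: FOPDT, lam: float):
    """PID settings from IMC with a first-order Pade approximation of the dead time."""
    Kc = (model.tau + model.theta / 2) / (model.K * (lam + model.theta / 2))
    tau_I = model.tau + model.theta / 2
    tau_D = model.tau * model.theta / (2 * model.tau + model.theta)
    return Kc, tau_I, tau_D

if __name__ == "__main__":
    eotf = FOPDT(K=1.8, tau=12.0, theta=2.5)   # hypothetical reduced EOTF of one loop
    print(imc_pi(eotf, lam=3.0))
    print(imc_pid(eotf, lam=3.0))
```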
4955 Optimization of Design Parameters for Wire Mesh Fin Arrays as a Heat Sink Using Taguchi Method
Authors: Kavita H. Dhanawade, Hanamant S. Dhanawade
Abstract:
Heat transfer enhancement objects like extended surfaces, fins, etc. are chosen for their thermal performance as well as for other design parameters, depending on the application. The present paper reports an experimental study investigating heat transfer enhancement through wire mesh fin arrays mounted on a horizontal base plate. The data used in the performance analysis were obtained experimentally for mild steel at different heat inputs (40, 60, 80, 100 and 120 W) by varying the wire mesh diameter, the fin height and the spacing between two fin arrays. Using the Taguchi experimental design method, the optimum design parameters and their levels were investigated. The average heat transfer coefficient was considered as the performance characteristic parameter. An L9 (3^3) orthogonal array was selected as the experimental plan. The optimum results were found experimentally. It is observed that the wire mesh diameter and fin height have a higher impact on the heat transfer coefficient than the spacing between two fin arrays.
Keywords: Heat transfer enhancement, finned surface, wire mesh diameter, natural convection.
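A minimal sketch of the Taguchi analysis described above follows: larger-the-better signal-to-noise (S/N) ratios for the average heat transfer coefficient over an L9(3^3) array, with main effects per factor level. The factor levels and response values are hypothetical placeholders, not the paper's measured data.

```python
import numpy as np

# L9(3^3) orthogonal array: 9 runs x 3 factors, levels coded 0, 1, 2
# (factors: wire mesh diameter, fin height, spacing between fin arrays).
L9 = np.array([
    [0, 0, 0], [0, 1, 1], [0, 2, 2],
    [1, 0, 1], [1, 1, 2], [1, 2, 0],
    [2, 0, 2], [2, 1, 0], [2, 2, 1],
])

# Hypothetical average heat transfer coefficients (W/m^2.K) for the 9 runs.
h = np.array([7.1, 7.8, 8.2, 7.5, 8.6, 7.9, 8.1, 8.8, 8.4])

# Larger-the-better S/N ratio: S/N = -10*log10(mean(1/y^2)).
sn = -10.0 * np.log10(1.0 / h**2)

# Main effect of each factor = mean S/N at each of its three levels.
for factor, name in enumerate(["mesh diameter", "fin height", "fin spacing"]):
    means = [sn[L9[:, factor] == level].mean() for level in range(3)]
    print(name, [round(m, 3) for m in means])
# The level with the highest mean S/N is the Taguchi optimum for that factor;
# the factor with the largest spread of means has the highest impact.
```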
4954 Investigation of Rehabilitation Effects on Fire Damaged High Strength Concrete Beams
Authors: Eun Mi Ryu, Ah Young An, Ji Yeon Kang, Yeong Soo Shin, Hee Sun Kim
Abstract:
When high strength reinforced concrete is exposed to high temperature due to a fire, deteriorations occur, such as loss of strength and elastic modulus, and cracking and spalling of the concrete. Therefore, it is important to understand the risk to structural safety in building structures by studying the structural behavior and rehabilitation of fire-damaged high strength concrete structures. This paper aims at investigating the rehabilitation effect on fire-damaged high strength concrete beams using experimental and analytical methods. In the experiments, flexural specimens with high strength concrete are exposed to high temperatures according to the ISO 834 standard time-temperature curve. Results from four-point loading tests show that the maximum loads of the rehabilitated beams are similar to or higher than those of the non-fire-damaged RC beam. In addition, structural analyses are performed using ABAQUS 6.10-3 under the same conditions as the experiments to provide accurate predictions of the structural and mechanical behaviors of the rehabilitated RC beams. The parameters are the fire cover thickness and the strength of the repairing mortar. The analytical results show good rehabilitation effects when the responses predicted from the rehabilitated models are compared to the structural behaviors of the non-damaged RC beams. In this study, fire-damaged high strength concrete beams are rehabilitated using polymeric cement mortar. The predictions from the finite element (FE) models show good agreement with the experimental results, and the modeling approach can be used to investigate the applicability of various rehabilitation methods in further studies.
Keywords: Fire, High strength concrete, Rehabilitation, Reinforced concrete beam.
4953 Internal Force State Recognition of Jiujiang Bridge Based on Cable Force-displacement Relationship
Authors: Weifeng Wang, Guoqing Huang, Xianwei Zeng
Abstract:
The nearly 21-year-old Jiujiang Bridge, which is suffering from an uneven line shape, constant severe downwarping of the main beam and cracking of the box girder, needs reinforcement and cable adjustment. It has undergone cable adjustment twice, with incomplete data. Therefore, identifying the initial internal force state of the Jiujiang Bridge is the key to the cable adjustment project. Based on parameter identification by means of static force test data, this paper suggests determining the initial internal force state of the cable-stayed bridge using a cable force-displacement relationship parameter identification method. That is, by measuring the displacement and the change in cable forces twice, one can identify the parameters concerned by means of optimization. This method is applied to the cable adjustment, replacement and reinforcement project for the Jiujiang Bridge as guidance for the cable adjustment and reinforcement work on the bridge.
Keywords: Cable-stayed bridge, cable force-displacement, parameter identification, internal force state
4952 Underlying Cognitive Complexity Measure Computation with Combinatorial Rules
Authors: Benjapol Auprasert, Yachai Limpiyakorn
Abstract:
Measuring the complexity of software has been an insoluble problem in software engineering. Complexity measures can be used to predict critical information about the testability, reliability, and maintainability of software systems from automatic analysis of the source code. During the past few years, many complexity measures have been invented based on the emerging Cognitive Informatics discipline. These software complexity measures, including cognitive functional size, rely on the total cognitive weights of basic control structures such as loops and branches. This paper shows that the existing calculation method can generate different results that are algebraically equivalent. However, analysis of the combinatorial meaning of this calculation method reveals a significant flaw in the measure, which also explains why it does not satisfy Weyuker's properties. Based on these findings, improvement directions, such as measure fusion and a cumulative variable counting scheme, are suggested to enhance the effectiveness of cognitive complexity measures.
Keywords: Cognitive Complexity Measure, Cognitive Weight of Basic Control Structure, Counting Rules, Cumulative Variable Counting Scheme.
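To make the weight calculation concrete, the sketch below follows the commonly cited convention for cognitive weights of basic control structures (BCSs): weights of sequential structures add, weights of nested structures multiply. The weight table and the example structure tree are illustrative assumptions, not taken from the paper.

```python
# Cognitive weights of BCSs (Wang's convention, assumed here for illustration).
W_BCS = {
    "sequence": 1,
    "if": 2,
    "case": 3,
    "for": 3,
    "while": 3,
    "call": 2,
    "recursion": 3,
    "parallel": 4,
    "interrupt": 4,
}

def cognitive_weight(node):
    """node = (bcs_name, [children]); children are the structures nested inside it.
    Nested weights multiply along a branch; siblings at the same level add."""
    name, children = node
    own = W_BCS[name]
    if not children:
        return own
    return own * sum(cognitive_weight(child) for child in children)

if __name__ == "__main__":
    # a for-loop containing an if (with a nested call) followed by a while loop
    example = ("for", [("if", [("call", [])]),
                       ("while", [])])
    print(cognitive_weight(example))  # 3 * (2*2 + 3) = 21
```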
4951 Study on Robot Trajectory Planning by Robot End-Effector Using Dual Curvature Theory of the Ruled Surface
Authors: Y. S. Oh, P. Abhishesh, B. S. Ryuh
Abstract:
This paper presents a method of trajectory planning for the robot end-effector that accounts for the more accurate and smooth differential geometry of the ruled surface generated by the tool line fixed to the end-effector, based on the curvature theory of the ruled surface and on dual curvature theory, and focuses on the underlying relation that unites them to enhance the efficiency of trajectory planning. Robot motion can be represented by the motion properties of the ruled surface generated by the trajectory of the Tool Center Point (TCP). The linear and angular properties of the six degree-of-freedom motion of the end-effector are computed using explicit formulas and functions from curvature theory and dual curvature theory. This paper explains the complete dualization of the ruled surface and shows that the linear and angular motion obtained using dual curvature theory is more accurate and less complex.
Keywords: Dual curvature theory, robot end effector, ruled surface, TCP, tool center point.
4950 Semantic Spatial Objects Data Structure for Spatial Access Method
Authors: Kalum Priyanath Udagepola, Zuo Decheng, Wu Zhibo, Yang Xiaozong
Abstract:
Modern spatial database management systems require a unique Spatial Access Method (SAM) in order to solve complex spatial queries efficiently. In this case, the spatial data structure takes a prominent place in the SAM. An inadequate data structure leads to poor algorithmic choices and a deficient understanding of algorithm behavior on the spatial database. A key step in developing a better semantic spatial object data structure is to quantify the performance effects of the semantic and outlier detections that are not reflected in previous tree structures (the R-Tree and its variants). This paper explores a novel SSRO-Tree as a SAM based on the topo-semantic approach. The paper shows how to identify and handle semantic spatial objects together with outlier objects during page overflow/underflow, using gain/loss metrics. We introduce a new SSRO-Tree algorithm which achieves better performance in practice than the R*-Tree and RO-Tree algorithms when considering selection queries.
Keywords: Outlier, semantic spatial object, spatial objects, SSRO-Tree, topo-semantic.
4949 Specific Emitter Identification Based on Refined Composite Multiscale Dispersion Entropy
Authors: Shaoying Guo, Yanyun Xu, Meng Zhang, Weiqing Huang
Abstract:
The wireless communication network is developing rapidly, thus wireless security becomes more and more important. Specific emitter identification (SEI) is a vital part of wireless communication security as a technique to identify unique transmitters. In this paper, an SEI method based on multiscale dispersion entropy (MDE) and refined composite multiscale dispersion entropy (RCMDE) is proposed. The MDE and RCMDE algorithms are used to extract features for the identification of five wireless devices, and a cross-validation support vector machine (CV-SVM) is used as the classifier. The experimental results show that the total identification accuracy is 99.3%, even at a low signal-to-noise ratio (SNR) of 5 dB, which proves that MDE and RCMDE can describe the communication signal series well. In addition, compared with other methods, the proposed method is effective and provides better accuracy and stability for SEI.
Keywords: Cross-validation support vector machine, refined composite multiscale dispersion entropy, specific emitter identification, transient signal, wireless communication device.
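A minimal sketch of dispersion entropy and its multiscale extension, as used above for SEI features, is given below. The parameter choices (c classes, embedding dimension m, delay d, number of scales) are illustrative; the refined-composite variant additionally averages the pattern probabilities over the coarse-graining offsets at each scale before taking the entropy, which is not shown.

```python
import numpy as np
from scipy.stats import norm
from collections import Counter

def dispersion_entropy(x, c=6, m=2, d=1):
    x = np.asarray(x, dtype=float)
    # 1) map samples to (0,1) with the normal CDF, then to c discrete classes
    y = norm.cdf(x, loc=x.mean(), scale=x.std())
    z = np.clip(np.round(c * y + 0.5), 1, c).astype(int)
    # 2) build dispersion patterns of length m with delay d
    n = len(z) - (m - 1) * d
    patterns = [tuple(z[i + j * d] for j in range(m)) for i in range(n)]
    # 3) Shannon entropy of the pattern distribution, normalized by ln(c^m)
    counts = np.array(list(Counter(patterns).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p)) / np.log(c ** m)

def multiscale_de(x, scales=5, **kw):
    """Coarse-grain by non-overlapping averaging and compute DE at each scale."""
    x = np.asarray(x, dtype=float)
    out = []
    for tau in range(1, scales + 1):
        n = len(x) // tau
        coarse = x[: n * tau].reshape(n, tau).mean(axis=1)
        out.append(dispersion_entropy(coarse, **kw))
    return np.array(out)

if __name__ == "__main__":
    sig = np.random.randn(2048)          # stand-in for a device transient signal
    print(multiscale_de(sig, scales=5))  # MDE feature vector for the classifier
```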
4948 Expanding Affordable Housing through Inclusionary Zoning in the City of Toronto
Authors: Sam Moshaver
Abstract:
Reasonably priced and well-constructed housing must be an integral element supporting a healthy society. The absence of housing that everyone in society can afford negatively affects people's health, education, ability to get jobs, and ability to develop their communities. Without access to decent housing, economic development, the integration of immigrants, and inclusiveness are negatively impacted. Canada has a sterling record in creating housing compared to many other nations around the globe. Canadian housing is supported by a mature and responsive mortgage network and a top-quality construction industry, as well as safe, excellent-quality building materials that are readily available. Yet 1.7 million Canadian households occupy substandard abodes. During the past hundred years, Canada's government has made a wide variety of attempts to provide decent residential facilities every Canadian can afford. Despite these laudable efforts, today Canada is left with housing that is inadequate for many Canadians. People who own their housing are given all kinds of privileges and perks, while people with relatively low incomes who rent their apartments or houses are discriminated against. To help solve these problems, zoning based on an "inclusionary" philosophy is a tool developed to help provide people with the affordable residences that they need. Now, thirty years after its introduction, this type of zoning has been shown to be effective in helping build and provide Canadians with houses or apartments they can afford to pay for. Using this form of zoning can have different results depending on where and how it is used. After examining Canadian affordable housing and four American cases where this type of zoning was enforced in the USA, this paper makes various recommendations for expanding Canadians' access to housing they can afford.
Keywords: Affordable Housing, Inclusionary Zoning, Low-Income Housing, Toronto Housing.
4947 Investigations into Effect of Neural Network Predictive Control of UPFC for Improving Transient Stability Performance of Multimachine Power System
Authors: Sheela Tiwari, R. Naresh, R. Jha
Abstract:
The paper presents an investigation into the effect of neural network predictive control of a UPFC on the transient stability performance of a multimachine power system. The proposed controller consists of a neural network model of the test system. This model is used to predict the future control inputs using the damped Gauss-Newton method, which employs ‘backtracking’ as the line search method for step selection. The benchmark two-area, four-machine system that mimics the behavior of large power systems is taken as the test system for the study and is subjected to three-phase short circuit faults at different locations over a wide range of operating conditions. The simulation results clearly establish the robustness of the proposed controller to the fault location, an increase in the critical clearing time for the circuit breakers, and an improved damping of the power oscillations as compared to the conventional PI controller.
Keywords: Identification, Neural networks, Predictive control, Transient stability, UPFC.
4946 A Comparative Study on the Impact of Global Warming of Applying Low Carbon Factor Concrete Products
Authors: Su-Hyun Cho, Chang-U Chae
Abstract:
Environmental impact assessment techniques have been developed as a result of the worldwide efforts to reduce the environmental impact of global warming. By using quantification methods in the construction industry, it is now possible to manage greenhouse gas emissions by systematically evaluating the impact on the environment over the entire construction process. In particular, the proportion of greenhouse gas emissions arising at the production stage of construction materials is high, so efforts are especially needed in the construction field. In this research, focusing on concrete products as construction materials, we used the LCA method to compare the environmental impact assessment results and carbon emissions of newly developed products that apply low-carbon technologies with those of existing products. As a result, introducing industrial waste as a raw material showed a carbon reduction. Through this comparison of the carbon emission reduction effect of low carbon technologies, this study intends to provide academic data for the evaluation of greenhouse gases in the construction sector and for the development of future low carbon technologies.
Keywords: CO2 Emissions, CO2 Reduction, Ready-mixed Concrete, Environmental Impact Assessment.
4945 Mounting Time Reduction using Content-Based Block Management for NAND Flash File System
Authors: Won-Hee Cho, GeunHyung Lee, Deok-Hwan Kim
Abstract:
Flash memory has many advantages, such as low power consumption, strong shock resistance, fast I/O and non-volatility, and it is increasingly used in mobile storage devices. YAFFS, one of the NAND flash file systems, is widely used in embedded devices. However, the existing YAFFS takes a long time to mount the file system because it scans the whole spare area of every page of the NAND flash memory. In order to solve this problem, we propose a new content-based flash file system using a mounting time reduction technique. The proposed method only scans partial spare areas of some special pages by using content-based block management. The experimental results show that the proposed method reduces the average mounting time by 87.2% compared with JFFS2 and 69.9% compared with YAFFS.
Keywords: NAND Flash Memory, Mounting Time, YAFFS, JFFS2, Content-based Block management
4944 Word Base Line Detection in Handwritten Text Recognition Systems
Authors: Kamil R. Aida-zade, Jamaladdin Z. Hasanov
Abstract:
An approach is offered for a more precise definition of baseline borders in handwritten cursive text, and general problems of handwritten text segmentation are also analyzed. The offered method tries to solve problems that arise in handwriting recognition with a specific slant, or in other words, where the letters of the words are not on the same vertical line. As informative features, some recognition systems use the ascending and descending parts of the letters, found after the word's baseline detection. In such recognition systems, problems in baseline detection impact the quality of the recognition and decrease the recognition rate. Unlike other methods, here borders are found from small pieces containing segmentation elements and are defined as a set of linear functions. In this method, separate borders for the top and bottom border lines are found. At the end of the paper, as a result, Azerbaijani cursive handwritten texts written in the Latin alphabet by different authors are analyzed.
Keywords: Azeri, Azerbaijani, Latin, segmentation, cursive, HWR, handwritten, recognition, baseline, ascender, descender, symbols.
4943 Fuzzy Analytic Hierarchy Process for Determination of Supply Chain Performance Evaluation Criteria
Authors: Ibrahim Cil, Onur Kurtcu, H. Ibrahim Demir, Furkan Yener, Yusuf. S. Turkan, Muharrem Unver, Ramazan Evren
Abstract:
The fuzzy AHP (Analytic Hierarchy Process) method is a decision-making approach obtained by integrating the conventional AHP method with a fuzzy structure. In this study, the production planning, inventory management and purchasing processes of a system were analysed, and each area was asked to decide on its performance criteria. At this point, the current work processes were analysed by various decision-makers, and pairwise comparisons of the criteria, scored on a 1-9 scale, were completed. The criteria were ranked by their weights using the fuzzy AHP approach, and the top three performance criteria of each department were determined. After that, the performance criteria of the supply chain consisting of the three departments were determined. The processes of each department were compared by the decision-makers when building the supply chain performance system and obtaining its performance criteria. According to the results, the criteria of the supply chain performance system were determined by using fuzzy AHP and will be used in the supply chain performance system in the future.
Keywords: AHP, fuzzy, performance evaluation, supply chain.
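A minimal sketch of how criteria weights can be derived from a triangular-fuzzy pairwise comparison matrix follows (fuzzy geometric mean with centroid defuzzification). The specific aggregation method and the 3x3 matrix of judgments below are illustrative assumptions, not the decision-makers' actual comparisons from the study.

```python
import numpy as np

def fuzzy_ahp_weights(F):
    """F: n x n x 3 array of triangular fuzzy numbers (l, m, u)."""
    F = np.asarray(F, dtype=float)
    # fuzzy geometric mean of each row, component-wise
    r = np.prod(F, axis=1) ** (1.0 / F.shape[0])          # n x 3
    # fuzzy weights: r_i multiplied by the inverse of the column sum,
    # i.e. (l, m, u) divided by (sum_u, sum_m, sum_l)
    total = r.sum(axis=0)
    w_fuzzy = r / total[::-1]
    # centroid defuzzification and normalization
    w = w_fuzzy.mean(axis=1)
    return w / w.sum()

# Hypothetical comparisons among three criteria of one department (fuzzified 1-9 scale).
one = (1, 1, 1)
f3, f3_inv = (2, 3, 4), (1/4, 1/3, 1/2)
f5, f5_inv = (4, 5, 6), (1/6, 1/5, 1/4)
F = [[one,    f3,     f5],
     [f3_inv, one,    f3],
     [f5_inv, f3_inv, one]]
print(fuzzy_ahp_weights(F))   # normalized crisp weights of the three criteria
```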
4942 Harmonic Pollution Control of the Electrical Network by Three-Phase Shunt Active Filter: Comparative Study of Controls, by Hysteresis and by Duty Cycle Modulation
Authors: T. Patrice Nna Nna, S. Ndjakomo Essiane, S. Pérabi Ngoffé, F. Amigue Fissou
Abstract:
This paper deals with the harmonic decontamination of current in an electrical grid by a shunt active filter in order to improve power quality. The contribution of this paper is mainly the proposal of a control strategy for an active filter based on Duty Cycle Modulation (DCM). First, the three-monophase method is applied for the identification of the disturbing currents; a Simulink model of this method is given for one phase of the grid. Secondly, two controls were designed: the first one is hysteresis control and the second one is DCM control. Finally, a comparative study of the two controls was performed. The results obtained show a significant improvement in the harmonic distortion rate for both controls. The hysteresis control is limited by the non-controllability of the switching frequencies of the inverter's switches and reduces the total harmonic distortion (THD) to 3.12%, whereas the DCM control limits the THD to 2.82%, which makes it the better of the two.
Keywords: Harmonic pollution, shunt active filter, hysteresis, Duty Cycle Modulation.
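To illustrate the baseline that the DCM control is compared against, the sketch below shows a hysteresis current controller for one inverter leg: the switch state changes whenever the current error leaves a fixed band, which is why the switching frequency is not controllable. The plant model, band width and reference signal are simplistic stand-ins; the paper's DCM control is not reproduced here.

```python
import numpy as np

def hysteresis_leg(i_ref, i_meas, band, prev_state):
    """Return the switching state: +1 (upper switch on) or -1 (lower switch on)."""
    err = i_ref - i_meas
    if err > band:
        return +1          # push the filter current up
    if err < -band:
        return -1          # push the filter current down
    return prev_state      # keep the previous state inside the band

# toy reference-tracking loop for one phase of the shunt active filter
t = np.linspace(0.0, 0.04, 4000)
i_ref = 5 * np.sin(2 * np.pi * 50 * t) + 1.5 * np.sin(2 * np.pi * 250 * t)
i, state = 0.0, +1
L, Vdc, dt = 5e-3, 400.0, t[1] - t[0]
for k in range(len(t)):
    state = hysteresis_leg(i_ref[k], i, band=0.2, prev_state=state)
    i += (state * Vdc / 2) / L * dt    # simplistic inductor-only coupling model
```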
4941 Influence of Parameters of Modeling and Data Distribution for Optimal Condition on Locally Weighted Projection Regression Method
Authors: Farhad Asadi, Mohammad Javad Mollakazemi, Aref Ghafouri
Abstract:
Recent research in neural network science and neuroscience on modeling complex time series data and statistical learning has focused mostly on learning from high-dimensional input spaces and signals. Local linear models are a strong choice for modeling local nonlinearity in data series. Locally weighted projection regression (LWPR) is a flexible and powerful algorithm for nonlinear approximation in high-dimensional signal spaces. In this paper, different learning scenarios for one- and two-dimensional data series with different distributions are investigated by simulation; noise is then added to the data distribution to create differently disordered distributions in the time series data and to evaluate the algorithm's ability to predict nonlinearity locally. The performance of the algorithm is simulated, and its sensitivity to the data distribution, as well as the influence of the important local-validity parameter of the algorithm, is explained for cases where the data distribution is wide or the number of data points is small.
Keywords: Local nonlinear estimation, LWPR algorithm, Online training method.
4940 Design of Extremum Seeking Control with PD Accelerator and its Application to Monod and Williams-Otto Models
Authors: Hitoshi Takata, Tomohiro Hachino, Masaki Horai, Kazuo Komatsu
Abstract:
In this paper, we are concerned with the design and simulation studies of a modified extremum seeking control for nonlinear systems. A standard extremum seeking control has a simple structure, but it takes a long time to reach an optimal operating point. We consider a modification of the standard extremum seeking control which is aimed at reaching the optimal operating point more speedily than the standard one. In the modification, a PD acceleration term is added before the integrator that forms the principal control, so that the plant is regulated to the optimal point smoothly. The proposed method is applied to the Monod and Williams-Otto models to investigate its effectiveness. Numerical simulation results show that this modified method can reach the optimal operating point more speedily than the standard one.
Keywords: Extremum seeking control, Monod model, Williams-Otto model, PD acceleration term, Optimal operating point.
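The sketch below shows one interpretation of the idea: a perturbation-based extremum seeking loop whose parameter update keeps the usual integral action and adds proportional and derivative terms on the demodulated gradient estimate. The toy static objective, gains and dither parameters are assumptions for illustration; the Monod and Williams-Otto dynamics are not modeled.

```python
import numpy as np

def extremum_seeking(J, theta0, *, a=0.1, w=5.0, k=0.8, kp=0.0, kd=0.0,
                     dt=1e-3, T=60.0):
    """Return the trajectory of the parameter estimate theta_hat for objective J."""
    steps = int(T / dt)
    theta_hat, integ, xi_prev = theta0, 0.0, 0.0
    traj = np.empty(steps)
    for n in range(steps):
        t = n * dt
        theta = theta_hat + a * np.sin(w * t)            # dither injection
        xi = J(theta) * np.sin(w * t)                     # demodulated gradient estimate
        integ += k * xi * dt                              # standard integral action
        dxi = (xi - xi_prev) / dt
        theta_hat = theta0 + integ + kp * xi + kd * dxi   # PD acceleration added
        xi_prev = xi
        traj[n] = theta_hat
    return traj

if __name__ == "__main__":
    J = lambda th: -(th - 2.0) ** 2                    # toy objective, optimum at 2
    plain = extremum_seeking(J, 0.0)                   # standard ES (kp = kd = 0)
    fast = extremum_seeking(J, 0.0, kp=0.3, kd=0.02)   # ES with PD accelerator
    print(plain[-1], fast[-1])
```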
4939 Improved Feature Extraction Technique for Handling Occlusion in Automatic Facial Expression Recognition
Authors: Khadijat T. Bamigbade, Olufade F. W. Onifade
Abstract:
The field of automatic facial expression analysis has been an active research area in the last two decades. Its vast applicability in various domains has drawn much attention to developing techniques and datasets that mirror real-life scenarios. Many techniques, such as Local Binary Patterns and its variants (CLBP, LBP-TOP) and, lately, deep learning techniques, have been used for facial expression recognition. However, the problem of occlusion has not been sufficiently handled, making their results not applicable in real-life situations. This paper develops a simple yet highly efficient method tagged Local Binary Pattern-Histogram of Gradient (LBP-HOG) with occlusion detection in face images, using a multi-class SVM for Action Unit and, in turn, expression recognition. Our method was evaluated on three publicly available datasets: JAFFE, CK and SFEW. Experimental results showed that our approach performed considerably well when compared with state-of-the-art algorithms and gave insight into occlusion detection as a key step to handling expressions in the wild.
Keywords: Automatic facial expression analysis, local binary pattern, LBP-HOG, occlusion detection.
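A minimal sketch of an LBP + HOG feature pipeline of the kind named above, using scikit-image, is shown below. The parameter values (P, R, cell and block sizes) and the stand-in grayscale image are illustrative; the occlusion-detection step and the multi-class SVM stage are not included.

```python
import numpy as np
from skimage import data
from skimage.feature import local_binary_pattern, hog

def lbp_hog_features(gray, P=8, R=1):
    # uniform LBP codes and their normalized histogram (P + 2 bins)
    lbp = local_binary_pattern(gray, P, R, method="uniform")
    n_bins = P + 2
    lbp_hist, _ = np.histogram(lbp, bins=n_bins, range=(0, n_bins), density=True)
    # HOG descriptor of the same face image
    hog_vec = hog(gray, orientations=9, pixels_per_cell=(8, 8),
                  cells_per_block=(2, 2), block_norm="L2-Hys")
    return np.concatenate([lbp_hist, hog_vec])

if __name__ == "__main__":
    face = data.camera()                 # stand-in grayscale image
    feats = lbp_hog_features(face)       # combined LBP-HOG feature vector
    print(feats.shape)
```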
4938 Rheological Properties of Dough and Sensory Quality of Crackers with Dietary Fibers
Authors: Ljubica Dokić, Ivana Nikolić, Dragana Šoronja–Simović, Zita Šereš, Biljana Pajin, Nils Juul, Nikola Maravić
Abstract:
The possibility of applying dietary fibers in the production of crackers was investigated in this work, as well as their influence on the rheological and textural properties of the dough for crackers and on the sensory properties of the obtained crackers. Three different dietary fibers (oat, potato and pea fibers) replaced 10% of the wheat flour. A long fermentation process and the baking test method were used for cracker production. The changes in the dough for crackers were observed by rheological methods for determining the viscoelastic dough properties and by textural measurements. The sensory quality of the obtained crackers was described using quantitative descriptive analysis (QDA) by trained members of a descriptive panel. Additional analysis of the cracker surface was performed with a videometer. Based on the rheological determination, the viscoelastic properties of the dough for crackers were reduced by the application of dietary fibers. Handling of the dough with 10% potato fiber was not possible, so the recipe was modified by increasing the water content to 35%. The dough compliance under constant stress decreased for the samples with dietary fibers, due to a more rigid and stiffer dough consistency compared to the control sample. Also, the hardness of the dough for these samples increased and the dough extensibility decreased. The sensory properties of the final products, the crackers, were reduced compared to the control sample. The application of dietary fibers mostly affected the hardness, structure and crispness of the crackers. The observed crackers received low marks for flavor and taste, due to the influence of the fibers' specific aroma. The sample with 10% potato fiber and increased water content was the most adaptable to the applied stresses and to the production process. This sample was also close to the control sample without dietary fibers in the evaluation of sensory properties and in the results of the videometer method.
Keywords: Crackers, dietary fibers, rheology, sensory properties.
4937 Evolved Bat Algorithm Based Adaptive Fuzzy Sliding Mode Control with LMI Criterion
Authors: P.-W. Tsai, C.-Y. Chen, C.-W. Chen
Abstract:
In this paper, the stability analysis of a GA-based adaptive fuzzy sliding mode controller for a nonlinear system is discussed. First, a nonlinear plant is well-approximated and described with a reference model and a fuzzy model, both involving FLC rules. Then, the FLC rules and the consequent parameters are decided on via an Evolved Bat Algorithm (EBA). After this, we guarantee a new tracking performance inequality for the control system. The tracking problem is characterized as an eigenvalue problem (EVP) to be solved. Next, an adaptive fuzzy sliding mode controller (AFSMC) is proposed to stabilize the system so as to achieve good control performance. Lyapunov's direct method can be used to ensure the stability of the nonlinear system. It is shown that the stability analysis can reduce nonlinear systems to a linear matrix inequality (LMI) problem. Finally, a numerical simulation is provided to demonstrate the control methodology.
Keywords: Adaptive fuzzy sliding mode control, Lyapunov direct method, swarm intelligence, evolved bat algorithm.
4936 Motor Imaginary Signal Classification Using Adaptive Recursive Bandpass Filter and Adaptive Autoregressive Models for Brain Machine Interface Designs
Authors: Vickneswaran Jeyabalan, Andrews Samraj, Loo Chu Kiong
Abstract:
The noteworthy point in the advancement of Brain Machine Interface (BMI) research is the ability to accurately extract features of the brain signals and to classify them into a targeted control action with the simplest procedures, since the expected beneficiaries are the disabled. In this paper, a new feature extraction method using the combination of adaptive band-pass filters and adaptive autoregressive (AAR) modelling is proposed and applied to the classification of right and left motor imagery signals extracted from the brain. The introduction of the adaptive band-pass filter improves the characterization of the autocorrelation functions of the AAR models, as it enhances and strengthens the EEG signal, which is noisy and stochastic in nature. The experimental results on the Graz BCI data set have shown that, by implementing the proposed feature extraction method, LDA and SVM classifiers outperform other AAR approaches of the BCI 2003 competition in terms of the mutual information, the competition criterion, and the misclassification rate.
Keywords: Adaptive autoregressive, adaptive bandpass filter, brain machine Interface, EEG, motor imaginary.
4935 Hypothesis of a Holistic Treatment of Cancer: Crab Method
Authors: Devasis Ghosh
Abstract:
The main hindrances to the total cure of cancer are a) the failure to control the continued production of cancer cells, b) its sustenance and c) its metastasis. This review study has tried to address the issue of total cancer cure in a more innovative way. A 10-pronged “CRAB METHOD”, a novel holistic scientific approach to cancer treatment, is hypothesized in this paper. Apart from the available chemotherapy, radiotherapy and oncosurgery (which shall not be discussed here), seven other points of interference and treatment have been suggested, i.e. 1. Efficient stress management. 2. Dampening of ATF3 expression. 3. Selective inhibition of platelet activity. 4. Modulation of serotonin production and metabolism, and 5HT receptor antagonism. 5. Auxin, its anti-proliferative potential and its modulation. 6. Melatonin supplementation because of its oncostatic properties. 7. HDAC inhibitors, especially valproic acid, due to their apoptotic role in many cancers. If all the above seven steps are thoroughly taken care of at the time of the initial diagnosis of cancer, along with the available treatment modalities of chemotherapy, radiotherapy and oncosurgery, then perhaps the morbidity and mortality rate of cancer may be greatly reduced.
Keywords: ATF3 dampening, auxin modulation, cancer, platelet activation, serotonin, stress, valproic acid.
4934 Estimation of Systolic and Diastolic Pressure using the Pulse Transit Time
Authors: Soo-young Ye, Gi-Ryon Kim, Dong-Keun Jung, Seong-wan Baik, Gye-rok Jeon
Abstract:
In this paper, an algorithm for estimating blood pressure using the pulse transit time (PTT) is proposed as a more convenient method of measuring blood pressure. After measuring the ECG, the pressure pulse and the photoplethysmogram, the PTT was calculated from the acquired signals. Thereafter, a system to indirectly measure the systolic and diastolic pressures was composed using a statistical method. In a comparison between the blood pressure indirectly measured by the proposed estimation algorithm and the real blood pressure measured by a conventional sphygmomanometer, the systolic pressure shows a mean error of ±3.24 mmHg and a standard deviation of 2.53 mmHg, while the diastolic pressure shows a satisfactory result, namely a mean error of ±1.80 mmHg and a standard deviation of 1.39 mmHg. These results satisfy the ANSI/AAMI regulation for the certification of sphygmomanometers, which requires the measurement error to be within a mean error of ±5 mmHg and a standard deviation of 8 mmHg. These results suggest the possibility of applying the method to portable and long-term blood pressure monitoring systems in the future.
Keywords: Blood pressure, Systolic, Diastolic, Pulse transit time.
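A minimal sketch of the pipeline described above follows: PTT is taken as the delay from each ECG R-peak to the foot of the following pulse wave, and the systolic/diastolic pressures are fitted to PTT by simple regression against cuff reference values. The 1/PTT linear model form and the peak-detection thresholds are assumptions for illustration, not the paper's exact statistical method.

```python
import numpy as np
from scipy.signal import find_peaks

def pulse_transit_times(ecg, ppg, fs):
    """PTT (s) from each ECG R-peak to the next foot (local minimum) of the PPG."""
    r_peaks, _ = find_peaks(ecg, distance=int(0.4 * fs),
                            height=np.percentile(ecg, 90))
    feet, _ = find_peaks(-ppg, distance=int(0.4 * fs))
    ptt = []
    for r in r_peaks:
        later = feet[feet > r]
        if later.size:
            ptt.append((later[0] - r) / fs)
    return np.array(ptt)

def fit_bp_model(ptt, bp_ref):
    """Least-squares fit of BP ~ a/PTT + b against cuff reference measurements."""
    A = np.column_stack([1.0 / ptt, np.ones_like(ptt)])
    coef, *_ = np.linalg.lstsq(A, bp_ref, rcond=None)
    return coef                               # (a, b) for systolic or diastolic

def estimate_bp(ptt, coef):
    a, b = coef
    return a / ptt + b
```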
4933 Neural Networks and Particle Swarm Optimization Based MPPT for Small Wind Power Generator
Authors: Chun-Yao Lee, Yi-Xing Shen, Jung-Cheng Cheng, Yi-Yin Li, Chih-Wen Chang
Abstract:
This paper proposes a method combining an artificial neural network (ANN) with particle swarm optimization (PSO) to implement maximum power point tracking (MPPT) by controlling the rotor speed of the wind generator. First, measurements of the wind speed, the rotor speed of the wind power generator and the output power of the wind power generator are used to train the artificial neural network and to estimate the wind speed. Second, the method mentioned above is applied to estimate and control the optimal rotor speed of the wind turbine so as to output the maximum power. Finally, the results reveal that the control system discussed in this paper extracts the maximum output power of the wind generator within a short duration, even under variations in wind speed and load impedance.
Keywords: Maximum power point tracking, artificial neural network, particle swarm optimization.
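The sketch below illustrates the search step of such a scheme: a surrogate model predicts output power from (wind speed, rotor speed), and PSO searches for the rotor speed that maximizes it. The surrogate here is a stand-in analytic function rather than a trained ANN, and the bounds, gains and swarm settings are illustrative assumptions.

```python
import numpy as np

def ann_power(wind_speed, rotor_speed):
    """Stand-in for the trained ANN: a Cp-like curve with a single maximum."""
    lam = rotor_speed / max(wind_speed, 1e-6)          # tip-speed-ratio proxy
    cp = np.clip(0.5 * np.exp(-((lam - 7.0) ** 2) / 8.0), 0, None)
    return cp * wind_speed ** 3

def pso_mppt(wind_speed, bounds=(0.5, 120.0), n_particles=15, iters=40):
    rng = np.random.default_rng(0)
    lo, hi = bounds
    x = rng.uniform(lo, hi, n_particles)               # candidate rotor speeds
    v = np.zeros(n_particles)
    pbest = x.copy()
    pbest_val = np.array([ann_power(wind_speed, xi) for xi in x])
    gbest = pbest[pbest_val.argmax()]
    for _ in range(iters):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        val = np.array([ann_power(wind_speed, xi) for xi in x])
        better = val > pbest_val
        pbest[better], pbest_val[better] = x[better], val[better]
        gbest = pbest[pbest_val.argmax()]
    return gbest                                       # rotor speed reference

print(pso_mppt(wind_speed=9.0))
```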
4932 An Approach to Measure Snow Depth of Winter Accumulation at Basin Scale Using Satellite Data
Authors: M. Geetha Priya, D. Krishnaveni
Abstract:
Snow depth estimation and monitoring studies have been carried out for decades using empirical relationships or extrapolation of point measurements carried out in the field. With the development of advanced satellite-based remote sensing techniques, a modified approach is proposed in the present study to estimate the winter accumulated snow depth at basin scale. Snow depth can be assessed by differencing Digital Elevation Models (DEMs) generated at the beginning and end of the winter season for the region of interest (Himalayan and polar regions), accounting for winter accumulation (solid precipitation). The proposed approach is based on the existing geodetic method that is used for glacier mass balance estimation. Considering satellite datasets acquired strictly at the beginning and end of the winter season, it is possible to estimate the change in depth or thickness of the snow accumulated during the winter, as it takes one year for the snow to be transformed into firn (snow that has survived one summer, or one-year-old snow).
Keywords: Digital elevation model, snow depth, geodetic method, snow cover.
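A minimal sketch of the geodetic-style differencing described above: subtract the early-winter DEM from the end-of-winter DEM and mask the result to the snow-covered area. The file names, nodata value and snow mask are placeholders; real use requires co-registered DEMs on the same grid.

```python
import numpy as np
import rasterio

def snow_depth_change(dem_start_path, dem_end_path, snow_mask=None, nodata=-9999.0):
    with rasterio.open(dem_start_path) as src0, rasterio.open(dem_end_path) as src1:
        dem0 = src0.read(1).astype(float)
        dem1 = src1.read(1).astype(float)
    depth = dem1 - dem0                                  # winter accumulation (m)
    depth[(dem0 == nodata) | (dem1 == nodata)] = np.nan
    if snow_mask is not None:
        depth[~snow_mask] = np.nan                       # keep snow-covered cells only
    return depth

# basin-average winter snow depth (hypothetical file names):
# depth = snow_depth_change("dem_early_winter.tif", "dem_end_winter.tif")
# print(np.nanmean(depth))
```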
4931 Comparation Treatment Method for Industrial Tempeh Waste by Constructed Wetland and Activated Sludge
Authors: Imanda H. Pradana, Tillana Adilaviana, Christine Pretty Ballerena
Abstract:
Ever since the industrial revolution began, our ecosystem has changed, and indeed the negatives outweigh the positives. Industrial waste is usually released into all kinds of bodies of water, such as rivers or the sea. Tempeh waste is one example of waste that carries many hazardous and unwanted substances that affect the surrounding environment. Tempeh is a popular fermented food in Asia which is rich in nutrients and active substances. Tempeh liquid waste, in particular, can cause air pollution, and if it penetrates the soil, it contaminates the groundwater, making the water unfit for consumption. Moreover, bacteria will thrive within the polluted water, and they are often responsible for causing many kinds of diseases. The treatments used for this waste are biological treatments such as constructed wetlands and activated sludge. These kinds of treatments are able to reduce both physical and chemical parameters, such as temperature, TSS, pH, BOD, COD, NH3-N, NO3-N, and PO4-P. These treatments are implemented before the waste is released into the water. The result is a comparison between constructed wetland and activated sludge treatment, along with a determination of which method is better suited to reduce the physical and chemical loads of the waste.
Keywords: activated sludge, constructed wetland, waste, water treatment
4930 Investigation of Nickel as a Metal Substitute of Palladium Supported on HBeta Zeolite for Waste Tire Pyrolysis
Authors: Lalita Saeaeh, Sirirat Jitkarnka
Abstract:
Pyrolysis of waste tires is an alternative technique to produce petrochemicals, such as light olefins, mixed C4, and mono-aromatics. Noble metals supported on acid zeolite catalysts have been reported as potential catalysts to produce high-value products from waste tire pyrolysis. In particular, Pd supported on HBeta gave a high yield of olefins, mixed C4, and mono-aromatics. Due to the high prices of noble metals, the objective of this work was to investigate whether or not non-noble Ni metal can be used as a substitute for the noble metal Pd supported on HBeta as a catalyst for waste tire pyrolysis. Ni was selected in this work because it has high activity in cracking, isomerization, hydrogenation and the ring opening of hydrocarbons. Moreover, Ni is an element in the same group as the noble metal Pd, which is group VIIIB, so it was expected to produce high-value products similar to those obtained from Pd. The amount of Ni was varied as 5, 10, and 20% by weight, for comparison with a fixed 1 wt% Pd, using incipient wetness impregnation. The results showed that, as a petrochemical-producing catalyst, 10%Ni/HBeta performed better than 1%Pd/HBeta because it not only produced the highest yields of olefins and cooking gases, but these yields were also higher than those of 1%Pd/HBeta. 5%Ni/HBeta can be used as a substitute for 1%Pd/HBeta for similar crude production because its crude contains similar amounts of naphtha and saturated HCs, although it gave no light mono-aromatics (C6-C11) in the oil. Additionally, 10%Ni/HBeta, which gave high yields of olefins and cooking gases, was found to give a fairly high concentration of light mono-aromatics in the oil.
Keywords: Catalytic pyrolysis, waste tire, Pd, Ni, HBeta.
4929 Evaluation of the Heating Capability and in vitro Hemolysis of Nanosized MgxMn1-xFe2O4 (x = 0.3 and 0.4) Ferrites Prepared by Sol-gel Method
Authors: Laura Elena De León Prado, Dora Alicia Cortés Hernández, Javier Sánchez
Abstract:
Among the different cancer treatments that are currently used, hyperthermia has a promising potential due to the multiple benefits that are obtained by this technique. In general terms, hyperthermia is a method that takes advantage of the sensitivity of cancer cells to heat in order to damage or destroy them. Among the different ways of supplying heat to cancer cells to achieve their destruction or damage, the use of magnetic nanoparticles has attracted attention due to the capability of these particles to generate heat under the influence of an external magnetic field. In addition, these nanoparticles have a high surface area and sizes similar to or even lower than those of biological entities, which allows them to approach and interact with a specific region of interest. The most widely used magnetic nanoparticles for hyperthermia treatment are those based on iron oxides, mainly magnetite and maghemite, due to their biocompatibility, good magnetic properties and chemical stability. However, in order to fulfill the requirements of magnetic hyperthermia treatment more efficiently, ferrites that incorporate different metallic ions, such as Mg, Mn, Co, Ca, Ni, Cu, Li, Gd, etc., into their structure have also been investigated. This paper reports the synthesis of nanosized MgxMn1-xFe2O4 (x = 0.3 and 0.4) ferrites by the sol-gel method and their evaluation in terms of heating capability and in vitro hemolysis, to determine the potential use of these nanoparticles as thermoseeds for the treatment of cancer by magnetic hyperthermia. It was possible to obtain ferrites with nanometric sizes, a single crystalline phase with an inverse spinel structure, and a behavior close to that of superparamagnetic materials. Additionally, at a concentration of 10 mg of magnetic material per mL of water, it was possible to reach a temperature of approximately 45°C, which is within the range of temperatures used for the treatment of hyperthermia. The results of the in vitro hemolysis assay showed that, at the concentrations tested, these nanoparticles are non-hemolytic, as their percentage of hemolysis is close to zero. Therefore, these materials can be used as thermoseeds for the treatment of cancer by magnetic hyperthermia.
Keywords: Ferrites, heating capability, hemolysis, nanoparticles, sol-gel.