Search results for: real time data.
9946 Beam and Diffuse Solar Energy in Zarqa City
Authors: Ali M. Jawarneh
Abstract:
Beam and diffuse radiation data are extracted analytically from previously measured data on a horizontal surface in Zarqa city. Moreover, radiation data on tilted surfaces with different slopes have been derived and analyzed. These data consist of the beam, diffuse, and ground-reflected radiation contributions. Hourly radiation data for the horizontal surface show the highest values in June; the values then decay as the slope increases, with the sharpest decrease for a vertical surface. The beam radiation on a horizontal surface has the highest values compared to diffuse radiation for all days of June. The total daily radiation on a tilted surface decreases with slope. The beam radiation also decays with slope, especially for a vertical surface. Diffuse radiation decreases slightly with slope, with a sharp decrease for a vertical surface. The ground-reflected radiation grows with slope, especially for a vertical surface. It is clear that in June the highest harvesting of solar energy occurs for a horizontal surface, and the harvest decreases as the slope increases.
Keywords: Beam and Diffuse Radiation, Zarqa City
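The abstract decomposes tilted-surface radiation into beam, diffuse and ground-reflected parts. As a reference only, the standard isotropic-sky (Liu-Jordan) form of that decomposition is sketched below; the paper's exact model and symbols are not stated in the abstract, so the notation here is assumed.

```latex
% Isotropic-sky model for total radiation on a surface tilted at slope \beta
% (assumed notation, not taken from the abstract).
I_T \;=\; \underbrace{I_b R_b}_{\text{beam}}
      \;+\; \underbrace{I_d\,\frac{1+\cos\beta}{2}}_{\text{diffuse}}
      \;+\; \underbrace{(I_b + I_d)\,\rho_g\,\frac{1-\cos\beta}{2}}_{\text{ground reflected}}
```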
9945 An Application of the Data Mining Methods with Decision Rule
Authors: Xun Ge, Jianhua Gong
Abstract:
Rankings of the output of China's main agricultural commodities in the world for 1978, 1980, 1990, 2000, 2006, 2007 and 2008 have been released in the United Nations FAO Database. Unfortunately, the ranking of the output of Chinese cotton lint in the world for 2008 is missing. This paper uses sequential data mining methods with decision rules to fill this gap. This new data mining method will help to further improve the United Nations FAO Database.
Keywords: Ranking, output of the main agricultural commodity, gross domestic product, decision table, information system, data mining, decision rule
9944 Optimization of Heat Treatment Due to Austenising Temperature, Time and Quenching Solution in Hadfield Steels
Authors: Sh. Hosseini, M. B. Limooei, M. Hossein Zade, E. Askarnia, Z. Asadi
Abstract:
Manganese steel (Hadfield) is one of the important alloys in industry due to its special properties. High work hardening ability combined with appropriate toughness and ductility are the properties that have led this alloy to be used in wear-resistant parts and high-strength conditions. Heat treatment is the main process through which the desired mechanical properties and microstructures are obtained in Hadfield steel. In this study, various heat treatment cycles, differing in austenising temperature, time and quenching solution, are applied. For this purpose, identical samples of manganese steel were heat treated in 9 different cycles, and then the mechanical properties and microstructures were investigated. Based on the results of the study, the optimum heat treatment cycle was obtained.
Keywords: Manganese steel (Hadfield), heat treatment, austenising temperature, austenising time, quenching solution, mechanical properties.
9943 Two-Dimensional Solitary Wave Solution to the Quadratic Nonlinear Schrödinger Equation
Authors: Sarun Phibanchon
Abstract:
The solitary wave solution of the quadratic nonlinear Schrödinger equation is determined by an iterative method known as the Petviashvili method. This solution is then used as the initial condition for the time evolution in order to study its stability. The spectral method is applied for the time evolution.
Keywords: soliton, iterative method, spectral method, plasma
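As a hedged illustration of the Petviashvili iteration named in the abstract, the sketch below applies it to an assumed one-dimensional stationary equation u_xx - lam*u + u^2 = 0 with quadratic nonlinearity. The paper's actual (two-dimensional) equation and parameters are not given in the abstract, so this toy setup is an assumption.

```python
import numpy as np

# Petviashvili iteration for u_xx - lam*u + u**2 = 0 (assumed model, not the paper's exact equation).
N, L, lam = 512, 50.0, 1.0                       # grid points, half-domain length, propagation constant
x = np.linspace(-L, L, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=x[1] - x[0])
u = np.exp(-x**2)                                # arbitrary positive initial guess
gamma = 2.0                                      # stabilizing exponent p/(p-1) for nonlinearity u**p, p = 2

for _ in range(200):
    Nhat = np.fft.fft(u**2)                      # Fourier transform of the nonlinear term
    uhat = np.fft.fft(u)
    denom = lam + k**2
    M = np.sum(denom * np.abs(uhat)**2) / np.sum(np.conj(uhat) * Nhat).real
    u = np.real(np.fft.ifft(M**gamma * Nhat / denom))

# For this assumed equation the exact solitary wave is (3*lam/2)*sech(sqrt(lam)*x/2)**2.
exact = 1.5 * lam / np.cosh(np.sqrt(lam) * x / 2) ** 2
print("max |u - exact| =", np.max(np.abs(u - exact)))
```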
9942 Water End-Use Classification with Contemporaneous Water-Energy Data and Deep Learning Network
Authors: Khoi A. Nguyen, Rodney A. Stewart, Hong Zhang
Abstract:
‘Water-related energy’ is energy use which is directly or indirectly influenced by changes to water use. Informatics, applying a range of mathematical, statistical and rule-based approaches, can be used to reveal important information on demand from the available data provided at second, minute or hourly intervals. This study aims to combine these two concepts to improve the current water end-use disaggregation problem by applying a wide range of the most advanced pattern recognition techniques to analyse concurrent high-resolution water-energy consumption data. The results show that the recognition accuracies of all end uses have increased significantly, especially for mechanised categories, including the clothes washer, dishwasher and evaporative air cooler, where over 95% of events were correctly classified.
Keywords: Deep learning network, smart metering, water end use, water-energy data.
9941 Decontamination of Chromium Containing Ground Water by Adsorption Using Chemically Modified Activated Carbon Fabric
Authors: J. R. Mudakavi, K. Puttanna
Abstract:
Chromium in the environment is considered one of the most toxic elements, probably next only to mercury and arsenic. It is acutely toxic, mutagenic and carcinogenic in the environment. Chromium contamination of soil and underground water due to industrial activities is a very serious problem in several parts of India, covering Karnataka, Tamil Nadu, Andhra Pradesh, etc. Functionally modified Activated Carbon Fabrics (ACF) offer targeted chromium removal from drinking water and industrial effluents. Activated carbon fabric is a lightweight adsorbing material with high surface area and low resistance to fluid flow. We have investigated surface modification of ACF using various acids in the laboratory, through batch as well as continuous-flow column experiments, with a view to developing the optimum conditions for chromium removal. Among the various acids investigated, phosphoric acid modified ACF gave the best results, with a removal efficiency of 95% under optimum conditions. The optimum pH was around 2-4 with a contact time of 2 hours. Continuous column experiments with an effective bed contact time (EBCT) of 5 minutes indicated that breakthrough occurred after 300 bed volumes. The adsorption data followed a Freundlich isotherm pattern. Nickel adsorbs preferentially, and sulphate reduces chromium adsorption by 50%. The ACF could be regenerated up to 52.3% using 3 M NaOH under optimal conditions. The process is simple, economical, energy efficient and applicable to industrial effluents and drinking water.
Keywords: Activated carbon fabric, adsorption, drinking water, hexavalent chromium.
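The abstract reports that the adsorption data follow a Freundlich isotherm, q = K_F * C^(1/n). A minimal sketch of fitting that isotherm is shown below; the concentration and uptake values are invented for illustration and are not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit of the Freundlich isotherm q = K_F * C**(1/n) to hypothetical Cr(VI) adsorption data.
def freundlich(C, KF, n):
    return KF * C ** (1.0 / n)

C_eq = np.array([0.5, 1.0, 2.0, 5.0, 10.0])    # equilibrium concentration, mg/L (hypothetical)
q_eq = np.array([1.9, 2.6, 3.4, 4.9, 6.3])     # uptake on ACF, mg/g (hypothetical)

(KF, n), _ = curve_fit(freundlich, C_eq, q_eq, p0=(1.0, 2.0))
print(f"K_F = {KF:.2f}, n = {n:.2f}")
```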
9940 Concurrent Access to Complex Entities
Authors: Cosmin Rablou
Abstract:
In this paper we present a way of controlling the concurrent access to data in a distributed application using the Pessimistic Offline Lock design pattern. In our case, the application processes a complex entity, which contains different other entities (objects) in a hierarchical structure. It will be shown how the complex entity and the contained entities must be locked in order to control the concurrent access to data.
Keywords: Object-oriented programming, Pessimistic Lock, Design pattern, Concurrent access to data, Processing complex entities
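As an illustration of the Pessimistic Offline Lock pattern applied to a hierarchical entity, a minimal Python sketch follows; the class and method names are hypothetical, and the paper's own implementation is not reproduced here.

```python
import threading

# Minimal sketch of the Pessimistic Offline Lock pattern for a hierarchical ("complex") entity.
class LockManager:
    def __init__(self):
        self._owners = {}                 # entity id -> session that holds the lock
        self._guard = threading.Lock()

    def acquire(self, entity_id, session):
        with self._guard:
            holder = self._owners.get(entity_id)
            if holder is not None and holder != session:
                raise RuntimeError(f"{entity_id} is locked by {holder}")
            self._owners[entity_id] = session

    def release_all(self, session):
        with self._guard:
            self._owners = {e: s for e, s in self._owners.items() if s != session}

def lock_complex_entity(manager, root, session):
    """Lock the root entity and, recursively, every contained entity."""
    manager.acquire(root["id"], session)
    for child in root.get("children", []):
        lock_complex_entity(manager, child, session)

# Usage: lock an order together with its nested line items before editing it offline.
order = {"id": "order-1", "children": [{"id": "item-1"}, {"id": "item-2"}]}
mgr = LockManager()
lock_complex_entity(mgr, order, session="user-A")   # a second session would now fail to acquire these
mgr.release_all("user-A")
```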
9939 Sparse Coding Based Classification of Electrocardiography Signals Using Data-Driven Complete Dictionary Learning
Authors: Fuad Noman, Sh-Hussain Salleh, Chee-Ming Ting, Hadri Hussain, Syed Rasul
Abstract:
In this paper, a data-driven dictionary approach is proposed for the automatic detection and classification of cardiovascular abnormalities. The electrocardiography (ECG) signal is represented by trained complete dictionaries that contain prototypes, or atoms, to avoid the limitations of pre-defined dictionaries. The data-driven trained dictionaries simply take the ECG signal as input, rather than extracting features, to study the set of parameters that yield the most descriptive dictionary. The approach inherently learns the complicated morphological changes in the ECG waveform, which is then used to improve the classification. The classification performance was evaluated with ECG data under two different preprocessing environments. In the first category, the QT database is baseline-drift corrected and a notch filter removes the 60 Hz power line noise. In the second category, the data are further filtered using a fast moving-average smoother. The experimental results on the QT database confirm that the proposed algorithm achieves a classification accuracy of 92%.
Keywords: Electrocardiogram, dictionary learning, sparse coding, classification.
9938 A Remote Sensing Approach to Calculate Population Using Roads Network Data in Lebanon
Authors: Kamel Allaw, Jocelyne Adjizian Gerard, Makram Chehayeb, Nada Badaro Saliba
Abstract:
In developing countries such as Lebanon, demographic data are hardly available due to the absence of a mechanized population registration system. The aim of this study is to evaluate, using only remote sensing data, the correlations between population and the characteristics of the road network (length of primary roads, length of secondary roads, total length of roads, density and percentage of roads, and the number of intersections). In order to find the influence of the different factors on the demographic data, we studied the degree of correlation between each factor and the population. The results of this study show a strong correlation between population and both the density of roads and the number of intersections.
Keywords: Population, road network, statistical correlations, remote sensing.
9937 Risk-Management by Numerical Pattern Analysis in Data-Mining
Authors: M. Kargar, R. Mirmiran, F. Fartash, T. Saderi
Abstract:
In this paper a new method is suggested for risk management using numerical patterns in data mining. These patterns are designed using probability rules in decision trees and are intended to be valid, novel, useful and understandable. Considering a set of functions, the system reaches a good pattern or better objectives. The patterns are analyzed through the produced matrices and some results are pointed out. By using the suggested method, the direction of the functionality route in the systems can be controlled and the best planning for special objectives can be done.
Keywords: Analysis, Data-mining, Pattern, Risk Management.
9936 Wind Speed Data Analysis using Wavelet Transform
Authors: S. Avdakovic, A. Lukac, A. Nuhanovic, M. Music
Abstract:
Renewable energy systems are becoming a topic of great interest and investment in the world. In recent years, wind power generation has experienced very fast development throughout the world. For planning and successful implementation of good wind power plant projects, wind potential measurements are required. In these projects, the effective choice of the micro location for wind potential measurements, the installation of the measurement station with the appropriate measuring equipment, its maintenance and the analysis of the gained data on wind potential characteristics are of great importance. In this paper, a wavelet transform has been applied to analyze the wind speed data, to gain insight into the characteristics of the wind and to select suitable locations that could be the subject of wind farm construction. This approach shows that it can be a useful tool in the investigation of wind potential.
Keywords: Wind potential, Wind speed data, Wavelet transform.
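A minimal sketch of the kind of wavelet decomposition described in the abstract, using the PyWavelets library on a synthetic hourly wind-speed series. The wavelet ('db4'), decomposition level and data are assumptions, since the abstract does not state them.

```python
import numpy as np
import pywt

# Discrete wavelet decomposition of a synthetic wind-speed series (hypothetical data).
t = np.arange(0, 24 * 30)                           # hourly samples over 30 days
wind = 6 + 2 * np.sin(2 * np.pi * t / 24) + np.random.default_rng(0).normal(0, 0.8, t.size)

coeffs = pywt.wavedec(wind, wavelet="db4", level=4)  # [approximation, detail_4, ..., detail_1]
# Reconstruct the slow (approximation) component by zeroing the detail bands.
approx = pywt.waverec([coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]], wavelet="db4")

print("detail-band energies:", [float(np.sum(c**2)) for c in coeffs[1:]])
```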
9935 Research on Morning Commuting Behavior under Autonomous Vehicle Environment Based on Activity Method
Authors: Qing Dai, Zhengkui Lin, Jiajia Zhang, Yi Qu
Abstract:
Based on the activity method, this paper focuses on morning commuting behavior when commuters travel with autonomous vehicles (AVs). Firstly, a net utility function of commuters is constructed from the activity utility of commuters at home, in the car and at the workplace, and the disutility of travel time cost and schedule delay cost. Then, this net utility function is applied to build an equilibrium model. Finally, under the assumption of constant marginal activity utility, the properties of the equilibrium are analyzed. The results show that, in autonomous driving, the starting and ending times of the morning peak and the number of commuters who arrive early or late at the workplace are the same as in manual driving. In autonomous driving, however, the departure rate of those arriving early at the workplace is higher than in manual driving, while the departure rate of those arriving late is just the opposite. In addition, compared with manual driving, the departure time for arriving at the workplace on time is earlier and the number of people queuing at the bottleneck is larger in autonomous driving. However, the net utility of commuters and the total net utility of the system in autonomous driving are greater than those in manual driving.
Keywords: Autonomous cars, bottleneck model, activity utility, user equilibrium.
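The abstract's net utility function is not given explicitly; the expression below is only a schematic form consistent with its description (activity utilities at home, in the car and at work, minus travel-time and schedule-delay costs), with all symbols assumed rather than taken from the paper.

```latex
% Schematic net utility of a commuter departing at time t (assumed notation):
% activity utilities at home, in the car and at work, minus travel-time cost
% and early/late schedule-delay costs; T(t) is travel time including bottleneck
% queuing and t^* the desired arrival time.
U(t) \;=\; \int_{t_0}^{t} u_h(s)\,ds \;+\; \int_{t}^{t+T(t)} u_c(s)\,ds \;+\; \int_{t+T(t)}^{t_1} u_w(s)\,ds
\;-\; \alpha\,T(t) \;-\; \beta\,\max\{0,\,t^{*}-t-T(t)\} \;-\; \gamma\,\max\{0,\,t+T(t)-t^{*}\}
```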
9934 A Biometric Template Security Approach to Fingerprints Based on Polynomial Transformations
Authors: Ramon Santana
Abstract:
The use of biometric identifiers in the field of information security, access control to resources, and authentication in ATMs and banking, among others, is of great concern because of the safety of biometric data. Eight vulnerabilities have been detected in the general architecture of a biometric system; six of them allow obtaining the minutiae template in plain text. The main consequence of obtaining minutiae templates is the loss of the biometric identifier for life. To mitigate these vulnerabilities, several models to protect minutiae templates have been proposed. Several vulnerabilities in the cryptographic security of these models allow biometric data to be obtained in plain text. In order to increase cryptographic security and ease of reversibility, a minutiae template protection model is proposed. The model aims to provide cryptographic protection and facilitate the reversibility of data using two levels of security. The first level of security is the data transformation level. This level generates data invariant to rotation and translation; this transformation is irreversible. The second level of security is the evaluation level, where the encryption key is generated and the data are evaluated using a defined evaluation function. The model is aimed at mitigating known vulnerabilities of the proposed models, basing its security on the impossibility of polynomial reconstruction.
Keywords: Fingerprint, template protection, bio-cryptography, minutiae protection.
9933 Mnemotopic Perspectives: Communication Design as Stabilizer for the Memory of Places
Authors: C. Galasso
Abstract:
The ancestral relationship between humans and the geographical environment has long been at the center of an interdisciplinary dialogue, which sees one of its main research nodes in the relationship between memory and places. Given its deep complexity, this symbiotic connection continues to look for a proper definition that appears increasingly negotiated by different disciplines. Numerous fields of knowledge are involved, from anthropology to semiotics of space, from photography to architecture, up to subjects traditionally far from these reasonings. This is the case of Design of Communication, a young discipline, now confident in itself and its objectives, aimed at finding and investigating original forms of visualization and representation, between sedimented knowledge and new technologies. In particular, Design of Communication for the Territory offers an alternative perspective to the debate, encouraging the reactivation and reconstruction of the memory of places. Recognizing mnemotopes as a cultural object of vertical interpretation of the memory-place relationship, design can become a real mediator of the territorial fixation of memories, making them increasingly accessible and perceptible, contributing to building a topography of memory. According to a mnemotopic vision, Communication Design can support the passage from a memory in which the observer participates only as an individual to a collective form of memory. A mnemotopic form of Communication Design can, through geolocation and content map-based systems, make chronology a topography rooted in the territory and practicable; it can be useful to understand how the perception of the memory of places changes over time, considering how to insert them in the contemporary world. Mnemotopes can be materialized in different formats of translation, editing and narration and then involved in complex systems of communication. The memory of places, therefore, if stabilized by the tools offered by Communication Design, can make visible ruins and territorial stratifications, illuminating them with new communicative interests that can be shared and participated in.
Keywords: Memory of places, design of communication, territory, mnemotope, topography of memory.
9932 Treatment of Spin-1/2 Particle in Interaction with a Time-Dependent Magnetic Field by the Fermionic Coherent-State Path-Integral Formalism
Authors: Aouachria Mekki
Abstract:
We consider a spin-1/2 particle interacting with a time-dependent magnetic field using the path integral formalism. The propagator is first written in the standard form, replacing the spin by two fermionic oscillators via the Schwinger model. The propagator is then determined exactly, thanks to a simple transformation, and the transition probability is deduced.
Keywords: Path integral, formalism, Propagator.
9931 Global Existence of Periodic Solutions in a Delayed Tri-neuron Network
Authors: Kejun Zhuang, Zhaohui Wen
Abstract:
In this paper, a tri-neuron network model with time delay is investigated. By using Bendixson's criterion for high-dimensional ordinary differential equations and global Hopf bifurcation theory for functional differential equations, sufficient conditions for the existence of periodic solutions when the time delay is sufficiently large are established.
Keywords: Delay, global Hopf bifurcation, neural network, periodic solutions.
9930 Study of Explicit Finite Difference Method in One Dimensional System
Authors: Azizollah Khormali, Seyyed Shahab Tabatabaee Moradi, Dmitry Petrakov
Abstract:
One of the most important parameters in petroleum reservoirs is the pressure distribution along the reservoir, as the pressure varies with time and location. A popular method to determine the pressure distribution in a reservoir in the unsteady-state regime of flow is to apply Darcy's equation and solve it numerically. The numerical simulation of reservoirs is based on numerical solutions of the different partial differential equations (PDEs) representing the multiphase flow of fluids. The pressure profile has been obtained in a one-dimensional system by solving Darcy's equation explicitly. Changes in the pressure profile in three situations are investigated in this work. These situations include changes in section length, changes in time step and time approaching infinity. The effects of these changes on the pressure profile are shown and discussed in the paper.
Keywords: Explicit solution, Numerical simulation, Petroleum reservoir, Pressure distribution.
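A minimal sketch of the explicit finite-difference approach described in the abstract, applied to the linearized 1D pressure-diffusion form of Darcy flow; the grid, diffusivity and boundary values are hypothetical, not the paper's reservoir data.

```python
import numpy as np

# Explicit finite-difference solution of dp/dt = eta * d2p/dx2 (linearized single-phase Darcy flow).
L, nx, nt = 100.0, 51, 2000            # reservoir length (m), grid points, time steps (assumed)
eta = 0.5                              # hydraulic diffusivity k/(phi*mu*c_t), m^2/s (assumed)
dx = L / (nx - 1)
dt = 0.4 * dx**2 / eta                 # satisfies the explicit stability limit dt <= dx^2/(2*eta)

p = np.full(nx, 20.0e6)                # initial reservoir pressure, Pa
p[0] = 10.0e6                          # producing boundary held at lower pressure
for _ in range(nt):
    p[1:-1] += eta * dt / dx**2 * (p[2:] - 2 * p[1:-1] + p[:-2])
    p[-1] = p[-2]                      # no-flow outer boundary

print(p[::10] / 1e6, "MPa")
```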
9929 Synthesis of Filtering in Stochastic Systems on Continuous-Time Memory Observations in the Presence of Anomalous Noises
Authors: S. Rozhkova, O. Rozhkova, A. Harlova, V. Lasukov
Abstract:
We have conducted the optimal synthesis of a root-mean-squared objective filter to estimate the state vector in the case where, within the observation channel with memory, anomalous noises with unknown mathematical expectation complement the regular noises. The synthesis has been carried out for linear stochastic systems of continuous time.
Keywords: Mathematical expectation, filtration, anomalous noise, memory.
9928 SIMGraph: Simplifying Contig Graph to Improve de Novo Genome Assembly Using Next-generation Sequencing Data
Authors: Chien-Ju Li, Chun-Hui Yu, Chi-Chuan Hwang, Tsunglin Liu , Darby Tien-Hao Chang
Abstract:
De novo genome assembly is always fragmented. Assembly fragmentation is more serious with the popular next-generation sequencing (NGS) data because NGS sequences are shorter than traditional Sanger sequences. As the data throughput of NGS is high, the fragmentations in assemblies are usually not the result of missing data. On the contrary, the assembled sequences, called contigs, are often connected to more than one other contig in a complicated manner, leading to the fragmentation. False connections in such complicated networks of contig connections, called a contig graph, are inevitable because of repeats and sequencing/assembly errors. Simplifying a contig graph by removing false connections directly improves genome assembly. In this work, we have developed a tool, SIMGraph, to resolve ambiguous connections between contigs using NGS data. Applying SIMGraph to the assembly of a fungus and a fish genome, we resolved 27.6% and 60.3% of ambiguous contig connections, respectively. These results can reduce the experimental efforts in resolving contig connections.
Keywords: Contig graph, NGS, de novo assembly, scaffold.
9927 Assessing Applicability of Kevin Lynch’s Framework of The Image of the City in the Case of the Walled City of Jaipur
Authors: Jay Patel
Abstract:
This research investigates the ‘image’ of the city and asks whether this ‘image’ holds any significance that can be changed. Kevin Lynch, in the book ‘The Image of the City’, develops a framework that breaks down the city’s image into five physical elements. These elements (Paths, Edges, Nodes, Districts, and Landmarks), according to Lynch, assess the legibility of urbanscapes, and emerged from his perception-based study in three different cities (New Jersey, Los Angeles, and Boston) in the USA. The aim of this research is to investigate whether Lynch’s framework can be applied within an Indian context or not. If so, what are the possibilities, and can the imageability of Indian cities be depicted through Lynch’s physical elements, or does it demand an extension of the framework by either adding or subtracting a physical attribute? For this research project, the walled city of Jaipur was selected, as it is considered one of the most visionary planned cities in India. The other significant reason for choosing Jaipur was that it is a historically planned city with solid historical, touristic and local importance, allowing an opportunity to understand the application of Lynch's elements to the city's image. In other words, it provides an opportunity to examine how the disadvantages of a city's implicit program (its relics of bygone eras) can be converted into assets by improving the imageability of the city. To obtain data, a structured semi-open-ended interview method was chosen. The reason for selecting this method was to gain qualitative data from the users rather than collecting quantitative data from closed-ended questions. This allowed an in-depth understanding of the applicability of Kevin Lynch’s framework while assessing what needs to be added. The interviews conducted in Jaipur yielded varied inferences that differed from the expected outcomes, highlighting the need to extend Lynch’s physical elements to capture the city’s image. While analyzing the data, a few attributes were found that define the image of Jaipur. These were categorized into two groups: physical aspects (streets and arcade entities, natural features, temples and temporary/informal activities) and associational aspects (history, culture and tradition, aids to wayfinding, and intangible aspects).
Keywords: Imageability, Kevin Lynch, People’s Perception, associational aspects, physical aspects.
9926 Acute Coronary Syndrome Prediction Using Data Mining Techniques - An Application
Authors: Tahseen A. Jilani, Huda Yasin, Madiha Yasin, C. Ardil
Abstract:
In this paper, we use data mining techniques to investigate factors that contribute significantly to enhancing the risk of acute coronary syndrome. We assume that the dependent variable is the diagnosis, with dichotomous values showing the presence or absence of disease. We have applied binary logistic regression to the factors affecting the dependent variable. The data set has been taken from two different cardiac hospitals in Karachi, Pakistan. We have a total of sixteen variables, of which one is the dependent variable and the other 15 are independent variables. For better performance of the regression model in predicting acute coronary syndrome, data reduction techniques such as principal component analysis were applied. Based on the results of data reduction, we have considered only 14 out of the sixteen factors.
Keywords: Acute coronary syndrome (ACS), binary logistic regression analyses, myocardial ischemia (MI), principal component analysis, unstable angina (U.A.).
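A minimal sketch of the workflow the abstract describes, principal component analysis for data reduction followed by binary logistic regression; the synthetic data below stand in for the hospital records, which are not available here.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for the 15 clinical predictors and the dichotomous diagnosis.
X, y = make_classification(n_samples=300, n_features=15, n_informative=8, random_state=0)

# PCA reduces the predictors before the binary logistic regression is fitted.
model = make_pipeline(PCA(n_components=14), LogisticRegression(max_iter=1000))
scores = cross_val_score(model, X, y, cv=5)
print("cross-validated accuracy:", scores.mean().round(3))
```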
9925 Study of Temperature Changes in Fars Province
Authors: A. Gandomkar, R. Dehghani
Abstract:
Climate change is a phenomenon whose existence, based on the available evidence, is now very probable. The speed and nature of changes in climate parameters in the middle of the twentieth century were different, faster than before, and their trend has changed to some extent compared to the past. The climate change issue, now regarded not only as one of the most common scientific topics but also as a social and political one, is not a new issue. Climate change is a complicated, long-term atmospheric-oceanic phenomenon on a global scale. Changes in precipitation patterns, the fast decrease of snow-covered resources and their rapid melting, increased evaporation, the occurrence of destructive floods, water shortage crises, and severe reductions in agricultural harvests are all signs of climate change. To cope with this phenomenon and its consequences, public instruction is the most important measure, but no significant and effective action has been taken so far. The present article includes a part of a survey on climate change in Fars. The study area has an annual mean temperature of 14 °C and annual precipitation of 320 mm; 23 stations inside the basin with a common 37-year statistical period have been used for the meteorological data (1974-2010). The Mann-Kendall and change factor methods are two statistical methods; applying them, the trends of the annual mean average temperature and the annual minimum mean temperature were studied. Based on the time series for each parameter, the annual mean average temperature and the mean annual maximum temperature have a rising trend, and this trend is clearer for the mean annual maximum temperature.
Keywords: Climate change, Coefficient of Variation, Fars province, Mann-Kendall method.
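A minimal sketch of the Mann-Kendall trend test mentioned in the abstract (without tie correction), applied to an invented 37-year annual temperature series rather than the Fars station data.

```python
import numpy as np
from scipy.stats import norm

# Mann-Kendall trend test (no tie correction) on a hypothetical annual mean temperature series.
def mann_kendall(x):
    x = np.asarray(x, dtype=float)
    n = x.size
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return z, 2 * (1 - norm.cdf(abs(z)))       # Z statistic and two-sided p-value

rng = np.random.default_rng(1)
temps = 14.0 + 0.03 * np.arange(37) + rng.normal(0, 0.4, 37)   # 37 made-up annual means
print("Z = %.2f, p = %.3f" % mann_kendall(temps))
```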
9924 Preparation and Properties of Biopolymer from L-Lactide (LL) and ε-Caprolactone (CL)
Authors: A. Buasri, N. Chaiyut, K. Iamma, K. Kongcharoen, K. Cheunsakulpong
Abstract:
Biopolymers have gained much attention as eco-friendly alternatives to petrochemical-based plastics because they are biodegradable and can be produced from renewable feedstocks. Two biopolyesters with many potential environmentally friendly applications are polylactic acid (PLA) and polycaprolactone (PCL). The PLA/PCL biodegradable copolyesters were synthesized by bulk ring-opening copolymerization of successively added L-lactide (LL) and ε-caprolactone (CL) in the presence of toluene, using 1-hexanol as initiator and stannous octoate (Sn(Oct)2) as catalyst. Reaction temperature, reaction time and amount of catalyst were evaluated to obtain the optimum reaction conditions. The results showed that the percentage conversion increased with increases in reaction temperature and reaction time, but decreased after a critical amount of catalyst was reached. The yield of the PLA/PCL biopolymer reached 98.02% at a reaction temperature of 160 °C, a catalyst amount of 0.3 mol% and a reaction time of 48 h. In addition, the thermal properties of the product were determined by differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA).
Keywords: Biopolymer, Polylactic Acid (PLA), Polycaprolactone (PCL), L-Lactide (LL), ε-Caprolactone (CL)
9923 Extraction of Natural Colorant from the Flowers of Flame of Forest Using Ultrasound
Authors: Sunny Arora, Meghal A. Desai
Abstract:
With the impetus towards green consumerism and the implementation of sustainable techniques, the consumption of natural products and the utilization of environmentally friendly techniques have gained accelerated acceptance. Butein, a natural colorant, has many medicinal properties apart from its use in dyeing industries. Extraction of butein from the flowers of flame of forest was carried out using an ultrasonication bath. Solid loading (2-6 g), extraction time (30-50 min), volume of solvent (30-50 mL) and type of solvent (methanol, ethanol and water) were studied to maximize the yield of butein using the Taguchi method. The highest yield of butein, 4.67% (w/w), was obtained using 4 g of plant material, 40 min of extraction time and 30 mL of methanol as solvent. The present method provided a greater reduction in extraction time compared to the conventional method of extraction. Hence, the outcome of the present investigation could further be utilized to develop the method at a larger scale.
Keywords: Butein, flowers of flame of forest, Taguchi method, ultrasonic bath.
9922 An Index based Forward Backward Multiple Pattern Matching Algorithm
Authors: Raju Bhukya, DVLN Somayajulu
Abstract:
Pattern matching is one of the fundamental applications in molecular biology. Searching DNA-related data is a common activity for molecular biologists. In this paper we explore the applicability of a new pattern matching technique called the Index based Forward Backward Multiple Pattern Matching algorithm (IFBMPM) for DNA sequences. Our approach avoids unnecessary comparisons in the DNA sequence; due to this, the number of comparisons of the proposed algorithm is much lower than that of other existing popular methods. The number of comparisons decreases rapidly, the execution time decreases accordingly, and the algorithm shows better performance.
Keywords: Comparisons, DNA Sequence, Index.
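The sketch below illustrates the general idea of index-based multiple pattern matching on DNA (index short q-grams, then verify candidates forwards and backwards); it is a simplified stand-in, not the authors' IFBMPM algorithm.

```python
from collections import defaultdict

# Simplified index-based multiple pattern matching on a DNA string (illustrative only).
def build_index(text, q=3):
    index = defaultdict(list)
    for i in range(len(text) - q + 1):
        index[text[i:i + q]].append(i)
    return index

def search(text, patterns, q=3):
    index = build_index(text, q)
    hits = defaultdict(list)
    for pat in patterns:
        for pos in index.get(pat[:q], []):            # candidates share the pattern's first q-gram
            window = text[pos:pos + len(pat)]
            if len(window) == len(pat) and window[-1] == pat[-1] and window == pat:
                hits[pat].append(pos)                 # cheap backward check on last char, then full verify
    return dict(hits)

dna = "ACGTACGTTAGCACGTAGGCTA"
print(search(dna, ["ACGTA", "AGGCT", "TAGC"]))
```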
9921 Load Discontinuity in Shock Response and Its Remedies
Authors: Shuenn-Yih Chang, Chiu-Li Huang
Abstract:
It has been shown that a load discontinuity at the end of an impulse will result in an extra impulse and hence an extra amplitude distortion if a step-by-step integration method is employed to yield the shock response. In order to overcome this difficulty, three remedies are proposed to reduce the extra amplitude distortion. The first remedy is to solve the momentum equation of motion instead of the force equation of motion in the step-by-step solution of the shock response, where an external momentum is used in the solution of the momentum equation of motion. Since the external momentum is a resultant of the time integration of the external force, the problem of load discontinuity automatically disappears. The second remedy is to perform a single small time step immediately upon termination of the applied impulse, while the other time steps can still be conducted using the step size determined from general considerations. This is because the extra impulse caused by a load discontinuity at the end of an impulse is almost linearly proportional to the step size. Finally, the third remedy is to use the average of the two different values at the integration point of the load discontinuity, instead of one of them, as the loading input. The basic motivation of this remedy originates from the concept of no loading input error associated with the integration point of load discontinuity. The feasibility of the three remedies is analytically explained and numerically illustrated.
Keywords: Dynamic analysis, load discontinuity, shock response, step-by-step integration.
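The first remedy in the abstract, solving the momentum equation of motion, can be made concrete by integrating the force equation once in time; the expression below is a sketch with assumed notation, not taken from the paper.

```latex
% Integrating the force equation M\ddot{u} + C\dot{u} + Ku = f(t) once over [0, t] gives the
% momentum equation of motion. Its right-hand side contains the external momentum
% \int_0^t f(\tau)\,d\tau, so a jump in f(t) at the end of an impulse no longer enters the
% loading input as a discontinuity (assumed notation).
M\dot{u}(t) + C\,u(t) + K\!\int_{0}^{t} u(\tau)\,d\tau
\;=\; M\dot{u}(0) + C\,u(0) + \int_{0}^{t} f(\tau)\,d\tau
```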
9920 Problem Solving in Chilean Higher Education: Figurations Prior in Interpretations of Cartesian Graphs
Authors: Verónica Díaz
Abstract:
A Cartesian graph, as a mathematical object, becomes a tool for the configuration of change. It is best understood through everyday-life problem solving associated with its representation. Despite this, the current educational framework favors general graphs, without consideration of their argumentation. Students are required to find the mathematical function without associating it with the development of graphical language. This research describes the use made by students of configurations drawn prior to Cartesian graphs with regard to an everyday-life problem related to a time and distance variation phenomenon. The theoretical framework describes the function conditions of study and their modeling. This is a qualitative, descriptive study involving six undergraduate case studies that were carried out during the first term of 2016 at the University of Los Lagos. The research problem concerned the graphic modeling of a real person's movement, and two levels of analysis were identified. The first level aims to identify local and global graph interpretations; the second level describes the degree of iconicity and referentiality of an image. According to the results, students did not draw any figures prior to the Cartesian graph, highlighting the need for students to represent the context and the movement that causes the phenomenon to change. From this, they produced Cartesian graphs representing changes in position and therefore achieved an overall view of the graph. However, the local view only indicates specific events in the problem situation, using graphic and verbal expressions to represent movement. This view does not enable us to identify what happens on the graph when the movement characteristics change, based on possible paths in the person's walking speed.
Keywords: Cartesian graphs, higher education, movement modeling, problem solving.
9919 Design of a Hand-Held, Clamp-on, Leakage Current Sensor for High Voltage Direct Current Insulators
Authors: Morné Roman, Robert van Zyl, Nishanth Parus, Nishal Mahatho
Abstract:
Leakage current monitoring for high voltage transmission line insulators is of interest as a performance indicator. Presently, to the best of our knowledge, there is no commercially available, clamp-on type, non-intrusive device for measuring leakage current on energised high voltage direct current (HVDC) transmission line insulators. The South African power utility, Eskom, is investigating the development of such a hand-held sensor for two important applications; first, for continuous real-time condition monitoring of HVDC line insulators and, second, for use by live line workers to determine if it is safe to work on energised insulators. In this paper, a DC leakage current sensor based on magnetic field sensing techniques is developed. The magnetic field sensor used in the prototype can also detect alternating current up to 5 MHz. The DC leakage current prototype detects the magnetic field associated with the current flowing on the surface of the insulator. Preliminary HVDC leakage current measurements are performed on glass insulators. The results show that the prototype can accurately measure leakage current in the specified current range of 1-200 mA. The influence of external fields from the HVDC line itself on the leakage current measurements is mitigated through a differential magnetometer sensing technique. Thus, the developed sensor can perform measurements on in-service HVDC insulators. The research contributes to the body of knowledge by providing a sensor to measure leakage current on energised HVDC insulators non-intrusively. This sensor can also be used by live line workers to inform them whether or not it is safe to perform maintenance on energized insulators.
Keywords: Direct current, insulator, leakage current, live line, magnetic field, sensor, transmission lines.
9918 The Research of Fuzzy Classification Rules Applied to CRM
Authors: Chien-Hua Wang, Meng-Ying Chou, Chin-Tzong Pang
Abstract:
In the era of great competition, understanding and satisfying customers' requirements are critical tasks for a company to make a profit. Customer relationship management (CRM) thus becomes an important business issue at present. With the help of data mining techniques, a manager can explore and analyze a large quantity of data to discover meaningful patterns and rules. Among all methods, the well-known association rule is the most commonly used. This paper is based on the Apriori algorithm and uses genetic algorithms combined with a data mining method to discover fuzzy classification rules. The mined results can be applied in CRM to help decision makers make correct business decisions for marketing strategies.
Keywords: Customer relationship management (CRM), Data mining, Apriori algorithm, Genetic algorithm, Fuzzy classification rules.
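A minimal Apriori-style frequent-itemset sketch for transactional CRM data of the kind the abstract describes; the transactions are invented, and the paper's fuzzy and genetic-algorithm extensions are not reproduced here.

```python
from itertools import combinations

# Apriori-style frequent-itemset mining over made-up customer transactions.
def apriori(transactions, min_support=0.5):
    transactions = [set(t) for t in transactions]
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    frequent, k, current = {}, 1, [frozenset([i]) for i in items]
    while current:
        counts = {c: sum(c <= t for t in transactions) for c in current}
        level = {c: cnt / n for c, cnt in counts.items() if cnt / n >= min_support}
        frequent.update(level)
        # candidate generation: join frequent k-itemsets into (k+1)-itemsets
        keys = list(level)
        current = list({a | b for a, b in combinations(keys, 2) if len(a | b) == k + 1})
        k += 1
    return frequent

trans = [["young", "urban", "buys"], ["young", "rural"], ["young", "urban", "buys"], ["senior", "urban"]]
for itemset, support in apriori(trans, 0.5).items():
    print(set(itemset), round(support, 2))
```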
9917 Equilibrium Modeling of Carbon Dioxide Adsorption on Zeolites
Authors: Alireza Behvandi, Somayeh Tourani
Abstract:
High pressure adsorption of carbon dioxide on zeolite 13X was investigated in the pressure range 0 to 4 MPa and at temperatures of 298, 308 and 323 K. The data fitting was accomplished with the Toth, UNILAN, Dubinin-Astakhov and virial adsorption models, which are generally used for microporous adsorbents such as zeolites. Comparison with experimental data from the literature indicated that the virial model gave the best results. These results may be partly attributed to the flexibility of the virial model, which can accommodate as many constants as the data warrant.
Keywords: adsorption models, zeolite, carbon dioxide.