Search results for: Taylor’s Series Method
18082 Labview-Based System for Fiber Links Events Detection
Authors: Bo Liu, Qingshan Kong, Weiqing Huang
Abstract:
With the rapid development of modern communication, diagnosing fiber-optic quality and faults in real time has received wide attention. In this paper, a LabVIEW-based system is proposed for fiber-optic fault detection. The wavelet threshold denoising method combined with Empirical Mode Decomposition (EMD) is applied to denoise the optical time domain reflectometer (OTDR) signal. Then a method based on the Gabor representation is used to detect events. Experimental measurements show that the signal-to-noise ratio (SNR) of the OTDR signal is improved by 1.34 dB on average, compared with using the wavelet threshold denoising method alone. The proposed system scores highly in event detection capability and accuracy. The maximum detectable fiber length of the proposed LabVIEW-based system is 65 km.
Keywords: empirical mode decomposition, events detection, Gabor transform, optical time domain reflectometer, wavelet threshold denoising
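The core of the denoising step above is coefficient thresholding; as an illustrative sketch (not the paper's implementation, which combines EMD with a specific wavelet basis), the standard soft-threshold operator applied to detail coefficients can be written as:

```python
def soft_threshold(coeffs, t):
    """Soft-threshold a sequence of wavelet detail coefficients:
    shrink each coefficient toward zero by t, zeroing those with
    magnitude below t (the usual wavelet-denoising nonlinearity)."""
    out = []
    for c in coeffs:
        mag = abs(c) - t
        out.append(0.0 if mag < 0 else (mag if c > 0 else -mag))
    return out
```

In a full pipeline this would be applied to the detail coefficients of each decomposition level before reconstruction.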
Procedia PDF Downloads 123
18081 Rounded-off Measurements and Their Implication on Control Charts
Authors: Ran Etgar
Abstract:
The process of rounding off measurements of continuous variables is commonly encountered. Although it usually has minor effects, it can sometimes lead to poor outcomes in statistical process control using the X̄-chart. The traditional control limits can lead to incorrect conclusions if applied carelessly. This study examines the limitations of the classical control limits, particularly the impact of asymmetry. An approach to determining the distribution function of the measured parameter (Ȳ) is presented, resulting in a more precise method for establishing the upper and lower control limits. The proposed method, while slightly more complex than Shewhart's original idea, remains user-friendly and accurate and requires only two straightforward tables.
Keywords: inaccurate measurement, SPC, statistical process control, rounding-off, control chart
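For context, the classical Shewhart limits that the abstract refines assume symmetric, exactly measured subgroup means; a sketch of the textbook 3-sigma limits (the authors' corrected tables are not reproduced here):

```python
import math

def xbar_limits(grand_mean, sigma, n, k=3.0):
    """Classical Shewhart X-bar chart limits: the mean of a subgroup of n
    observations has standard error sigma / sqrt(n); the limits sit k
    standard errors around the grand mean (k = 3 is the usual convention)."""
    se = sigma / math.sqrt(n)
    return grand_mean - k * se, grand_mean + k * se
```

Rounded-off data violate the continuity assumption behind these limits, which is exactly the gap the abstract's corrected limits address.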
Procedia PDF Downloads 40
18080 A Generative Adversarial Framework for Bounding Confounded Causal Effects
Authors: Yaowei Hu, Yongkai Wu, Lu Zhang, Xintao Wu
Abstract:
Causal inference from observational data is finding wide application in many fields. However, unidentifiable situations, where causal effects cannot be uniquely computed from observational data, pose critical barriers to applying causal inference to complicated real applications. In this paper, we develop a bounding method for estimating the average causal effect (ACE) under unidentifiable situations due to hidden confounders. We propose to parameterize the unknown exogenous random variables and structural equations of a causal model using neural networks and implicit generative models. Then, with an adversarial learning framework, we search the parameter space to explicitly traverse causal models that agree with the given observational distribution and find those that minimize or maximize the ACE to obtain its lower and upper bounds. The proposed method makes no assumptions about the data-generating process or the types of the variables. Experiments using both synthetic and real-world datasets show the effectiveness of the method.
Keywords: average causal effect, hidden confounding, bound estimation, generative adversarial learning
Procedia PDF Downloads 191
18079 An Intelligent Thermal-Aware Task Scheduler in Multiprocessor System on a Chip
Authors: Sina Saadati
Abstract:
Multiprocessor Systems-on-Chip (MPSoCs) are widely used in modern computers to execute sophisticated software and applications. These systems include different processors for distinct purposes. Most proposed task schedulers attempt to improve energy consumption. In some schedulers, the processor's temperature is considered in order to increase the system's reliability and performance. In this research, we propose a new method for thermal-aware task scheduling based on an artificial neural network (ANN). This method enables us to consider a variety of factors in the scheduling process. Factors such as ambient temperature, season (which is important for some embedded systems), processor speed, and the computational type of the tasks have a complex relationship with the final temperature of the system. This issue can be addressed using a machine learning algorithm. Another point is that our solution makes the system intelligent, so that it can be adaptive. We also show that the computational complexity of the proposed method is low. As a consequence, it is also suitable for battery-powered systems.
Keywords: task scheduling, MPSoC, artificial neural network, machine learning, computer architecture, artificial intelligence
Procedia PDF Downloads 103
18078 Energy Consumption Statistic of Gas-Solid Fluidized Beds through Computational Fluid Dynamics-Discrete Element Method Simulations
Authors: Lei Bi, Yunpeng Jiao, Chunjiang Liu, Jianhua Chen, Wei Ge
Abstract:
Two energy paths are proposed from thermodynamic viewpoints. Energy consumption means the total power input to the specific system, and it can be decomposed into energy retention and energy dissipation. Energy retention is the variation of accumulated mechanical energy in the system, and energy dissipation is the energy converted to heat by irreversible processes. Based on the Computational Fluid Dynamics-Discrete Element Method (CFD-DEM) framework, the different energy terms are quantified from the specific flow elements of fluid cells and particles, as well as their interactions with the wall. Direct energy consumption statistics are carried out for both cold and hot flow in gas-solid fluidization systems. To clarify the statistic method, it is necessary to identify which system is studied: the particle-fluid system or the particle sub-system. For the cold flow, the total energy consumption of the particle sub-system can predict the onset of bubbling and turbulent fluidization, while the trends of local energy consumption can reflect the dynamic evolution of mesoscale structures. For the hot flow, different heat transfer mechanisms are analyzed, and the original solver is modified to reproduce the experimental results. The influence of the heat transfer mechanisms and the heat source on energy consumption is also investigated. The proposed statistic method has proven to be energy-conservative and easy to conduct, and it is expected to be applicable to other multiphase flow systems.
Keywords: energy consumption statistic, gas-solid fluidization, CFD-DEM, regime transition, heat transfer mechanism
Procedia PDF Downloads 68
18077 Oil Recovery Study by Low Temperature Carbon Dioxide Injection in High-Pressure High-Temperature Micromodels
Authors: Zakaria Hamdi, Mariyamni Awang
Abstract:
For the past decades, CO₂ flooding has been used as a successful method for enhanced oil recovery (EOR). However, a high mobility ratio and the fingering effect are considered important drawbacks of this process. Low-temperature injection of CO₂ into high-temperature reservoirs may improve oil recovery, but simulating multiphase flow in a non-isothermal medium is difficult, and commercial simulators are very unstable under these conditions. Furthermore, to the best of the authors' knowledge, no experimental work has been done to verify the results of the simulations and to understand the pore-scale process. In this paper, we present the results of investigations on the injection of low-temperature CO₂ into a high-pressure high-temperature micromodel, with injection temperatures ranging from 34 to 75 °F. The effects of temperature and the saturation changes of the different fluids are measured in each case. The results support the proposed method: the injection of CO₂ at low temperatures increased oil recovery in high-temperature reservoirs significantly. Also, the CO₂-rich phases present in the high-temperature system can affect oil recovery through a better sweep of the oil, which is initially caused by penetration of liquid CO₂ into the system. Furthermore, no unfavorable effect was detected using this method. Low-temperature CO₂ injection is proposed for use as early as secondary recovery.
Keywords: enhanced oil recovery, CO₂ flooding, micromodel studies, miscible flooding
Procedia PDF Downloads 352
18076 Fiqh Challenge in Production of Halal Pharmaceutical Products
Authors: Saadan Man, Razidah Othmanjaludin, Madiha Baharuddin
Abstract:
Nowadays, pharmaceutical products are produced through the mixing of active and complex ingredients, naturally or synthetically, and involve extensive use of prohibited animal products. This article studies the challenges faced, from a fiqh perspective, in the production of halal pharmaceutical products, which frequently contain impure elements or prohibited animal derivatives according to Islamic law. This qualitative study adopts library research as well as field research, conducting a series of interviews with several related parties. The gathered data is analyzed from a Sharia perspective using several instruments, especially the principle of Maqasid al-Sharia. This study shows that the halal status of pharmaceutical products depends on three basic elements: the sources of the basic ingredients; the processes involved in the three phases of production, i.e., before, during and after; and the possible effects of the products. Various fiqh challenges need to be traversed in producing halal pharmaceutical products, including the sources of the ingredients, the logistics process, the tools used, and the production procedures. Thus, the whole supply chain of production of pharmaceutical products must be well managed in accordance with the halal standard.
Keywords: fiqh, halal pharmaceutical, pharmaceutical products, Malaysia
Procedia PDF Downloads 192
18075 A Semi-supervised Classification Approach for Trend Following Investment Strategy
Authors: Rodrigo Arnaldo Scarpel
Abstract:
Trend following is a widely accepted investment strategy that adopts a rule-based trading mechanism rather than striving to predict market direction or relying on information gathering to decide when to buy and when to sell a stock. Thus, in trend following one must respond to the market's recent and current movements, rather than to what will happen. Optimally, a trend following strategy catches a bull market at its early stage, rides the trend, and liquidates the position at the first evidence of the subsequent bear market. To apply the trend following strategy, one needs to find the trend and identify trade signals. In order to avoid false signals, i.e., to distinguish short-, mid- and long-term fluctuations and to separate noise from real changes in the trend, most academic works rely on moving averages and other technical analysis indicators, such as the moving average convergence divergence (MACD) and the relative strength index (RSI), to uncover intelligible stock trading rules following the trend following philosophy. Recently, some works have applied machine learning techniques for trading rule discovery. In those works, the process of rule construction is based on evolutionary learning, which aims to adapt the rules to the current environment and searches for the globally optimal rules in the search space. In this work, instead of focusing on machine learning techniques for creating trading rules, a time series trend classification employing a semi-supervised approach was used to identify early both the beginning and the end of upward and downward trends. Such a classification model can be employed to identify trade signals, and the decision-making procedure is that if an up-trend (down-trend) is identified, a buy (sell) signal is generated.
Semi-supervised learning is used for model training when only part of the data is labeled, and semi-supervised classification aims to train a classifier from both the labeled and unlabeled data such that it is better than a supervised classifier trained only on the labeled data. To illustrate the proposed approach, daily trade information was employed, including the open, high, low and closing values and volume, from January 1, 2000 to December 31, 2022, of the São Paulo Exchange Composite index (IBOVESPA). Over this time period, consistent changes in price, upwards or downwards, were visually identified for assigning labels, leaving the rest of the days (when there is no consistent change in price) unlabeled. For training the classification model, a pseudo-label semi-supervised learning strategy was used, employing different technical analysis indicators. In this learning strategy, the core idea is to use unlabeled data to generate pseudo-labels for supervised training. For evaluating the achieved results, the annualized return and excess return and the Sortino and Sharpe ratios were considered. Over the evaluated time period, the obtained results were very consistent and can be considered promising for generating the intended trading signals.
Keywords: evolutionary learning, semi-supervised classification, time series data, trading signals generation
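The pseudo-label strategy described above can be sketched generically: train on the labeled days, label the unlabeled days the model is confident about, and retrain. A minimal single-round illustration with a nearest-centroid classifier on a one-dimensional indicator value (the feature, labels and margin are placeholders, not the paper's actual indicators):

```python
def nearest_centroid_fit(xs, ys):
    """Return the per-class mean of a 1-D feature."""
    cents = {}
    for label in set(ys):
        vals = [x for x, y in zip(xs, ys) if y == label]
        cents[label] = sum(vals) / len(vals)
    return cents

def pseudo_label(labeled_x, labeled_y, unlabeled_x, margin=0.5):
    """One round of pseudo-labelling: assign a label only to unlabeled
    points whose distance gap between the two nearest class centroids
    exceeds `margin`; the augmented set is returned for retraining."""
    cents = nearest_centroid_fit(labeled_x, labeled_y)
    new_x, new_y = list(labeled_x), list(labeled_y)
    for x in unlabeled_x:
        dists = sorted((abs(x - c), label) for label, c in cents.items())
        if len(dists) > 1 and dists[1][0] - dists[0][0] > margin:
            new_x.append(x)
            new_y.append(dists[0][1])
    return new_x, new_y
```

In practice the classifier, the confidence criterion and the indicator features would all be richer; the loop of fit / pseudo-label / refit is the part the abstract describes.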
Procedia PDF Downloads 89
18074 The Analysis on the Renewal Strategy of Public Space in Old Communities with an Example of GeDa Community in Xi'An
Authors: Xiyue Wen
Abstract:
With the rapid development of cities, old communities are facing a series of problems. On one hand, aging facilities, obsolete spatial patterns, and aging populations accompany the aging of the community. On the other hand, public space is reduced and taken up by car parking or facility installations, which leads to the collapse of traditional life in the old communities. That is to say, modern amenities have not helped to reform the old community, but have made it tedious and inefficient when they are not accommodated in the traditional space. Exploring a way to ease the contradiction between modern living facilities and traditional spatial patterns is imminent. We select a typical site, GeDa Community in Xi'an, built in the 1970s-80s, and carry out a concept called 'Raising Landscape', which enables a convenient and efficient space for parking, as well as a high-quality yard for activities. In addition, the design implements low cost, simple construction, and resident participation, so that it can be spread to urban spaces with the same texture.
Keywords: old communities, renewal strategy, raising landscape, public space, parking space
Procedia PDF Downloads 480
18073 Geoecological Problems of Karst Waters in Chiatura Municipality, Georgia
Authors: Liana Khandolishvili, Giorgi Dvalashvili
Abstract:
Karst waters play an important role in water supply worldwide. Among them, the vauclusian springs in Chiatura municipality (Georgia) are used as drinking water and are irreplaceable for the local population. Accordingly, it is important to assess their geo-ecological condition and to maintain their sustainability. The aim of the paper is to identify the hazards of pollution of underground waters in the karst environment and to develop a scheme for their protection that takes into consideration both the hydrogeological characteristics and the role of humans. To achieve this goal, the EPIK method was selected, with which the epikarst zone of the study area was studied in detail, as well as the protective cover, the infiltration conditions and the degree of karst network development. The condition of karst waters in Chiatura municipality was then assessed, their main pollutants were identified, and recommendations were prepared for their protection. The results of the study showed that the karst water pollution rate in Chiatura municipality is highest where karst-fissured layers are present and intensive extraction works are underway. The EPIK method is innovative in Georgia and was first applied to the karst waters of Chiatura municipality.
Keywords: cave, EPIK method, pollution, karst waters, geology, geography, ecology
Procedia PDF Downloads 93
18072 VISMA: A Method for System Analysis in Early Lifecycle Phases
Authors: Walter Sebron, Hans Tschürtz, Peter Krebs
Abstract:
The choice of applicable analysis methods in safety or systems engineering depends on the depth of knowledge about a system and on the respective lifecycle phase. However, the analysis method chain still shows gaps, as it should support system analysis during the lifecycle of a system from a rough concept in the pre-project phase until end-of-life. This paper's goal is to discuss an analysis method, the VISSE Shell Model Analysis (VISMA) method, which aims at closing the gap in the early system lifecycle phases, like the conceptual or pre-project phase, or the project start phase. It was originally developed to aid in the definition of the system boundary of electronic system parts, e.g., a control unit for a pump motor. Furthermore, it can also be applied to non-electronic system parts. The VISMA method is a graphical, sketch-like method that stratifies a system and its parts in inner and outer shells, like the layers of an onion. It analyses a system in a two-step approach, from the innermost to the outermost components, followed by the reverse direction. To ensure a complete view of a system and its environment, the VISMA should be performed by (multifunctional) development teams. To introduce the method, a set of rules and guidelines has been defined in order to enable a proper shell build-up. In the first step, the innermost system, named the system under consideration (SUC), is selected; it is the focus of the subsequent analysis. Then, its directly adjacent components, responsible for providing input to and receiving output from the SUC, are identified. These components are the content of the first shell around the SUC. Next, the input and output components of the components in the first shell are identified and form the second shell around the first one. Continuing this way, shell after shell is added with its respective parts until the border of the complete system (the external border) is reached.
Last, two external shells are added to complete the system view: the environment shell and the use case shell. This system view is also stored for future use. In the second step, the shells are examined in the reverse direction (outside to inside) in order to remove superfluous components or subsystems. Input chains to the SUC, as well as output chains from the SUC, are described graphically via arrows to highlight functional chains through the system. As a result, this method offers a clear and graphical description and overview of a system, its main parts and environment; however, the focus still remains on a specific SUC. It helps to identify the interfaces and interfacing components of the SUC, as well as important external interfaces of the overall system. It supports the identification of the first internal and external hazard causes and causal chains. Additionally, the method promotes a holistic picture and cross-functional understanding of a system, its contributing parts, internal relationships and possible dangers within a multidisciplinary development team.
Keywords: analysis methods, functional safety, hazard identification, system and safety engineering, system boundary definition, system safety
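The shell build-up described above corresponds to breadth-first layers over a component connectivity graph; a sketch with invented component names (not from the paper):

```python
from collections import deque

def shells(graph, suc):
    """Return BFS layers (shells) around the system under consideration:
    shell 0 is the SUC itself, shell k the components at graph distance k,
    mirroring the inner-to-outer shell build-up of the VISMA method."""
    dist = {suc: 0}
    queue = deque([suc])
    while queue:
        node = queue.popleft()
        for neigh in graph.get(node, ()):
            if neigh not in dist:
                dist[neigh] = dist[node] + 1
                queue.append(neigh)
    layers = {}
    for node, d in dist.items():
        layers.setdefault(d, set()).add(node)
    return [layers[d] for d in sorted(layers)]
```

The method itself is a manual, team-based sketching activity; this only illustrates the shell structure it produces.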
Procedia PDF Downloads 225
18071 The Influence of Covariance Hankel Matrix Dimension on Algorithms for VARMA Models
Authors: Celina Pestano-Gabino, Concepcion Gonzalez-Concepcion, M. Candelaria Gil-Fariña
Abstract:
Some estimation methods for VARMA models, and multivariate time series models in general, rely on the use of a Hankel matrix. It is known that if the data sample is large enough and the dimension of the Hankel matrix is unnecessarily large, this may result in an unnecessary number of computations as well as in numerical problems. In this sense, the aim of this paper is two-fold. First, we provide some theoretical results for these matrices which translate into a lower dimension for the matrices normally used in the algorithms. This contribution thus serves to improve those methods from a numerical and, presumably, statistical point of view. Second, we have chosen an estimation algorithm to illustrate our improvements in practice. The results we obtained in a simulation of VARMA models show that an increase in the size of the Hankel matrix beyond the theoretical bound proposed as valid does not necessarily lead to improved practical results. Therefore, for future research, we propose conducting similar studies using any of the linear system estimation methods that depend on Hankel matrices.
Keywords: covariance Hankel matrices, Kronecker indices, system identification, VARMA models
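The covariance Hankel matrices in question stack the autocovariance matrices Γₖ of the process in anti-diagonal bands; a small sketch of the construction (dimensions chosen arbitrarily, not the paper's bounds):

```python
import numpy as np

def block_hankel(gammas, rows, cols):
    """Stack autocovariance blocks into the covariance Hankel matrix
    H = [Gamma_{i+j+1}]_{i,j}: `gammas[k]` holds the m x m matrix
    Gamma_{k+1}, so block (i, j) of H is gammas[i + j]."""
    m = gammas[0].shape[0]
    H = np.zeros((rows * m, cols * m))
    for i in range(rows):
        for j in range(cols):
            H[i * m:(i + 1) * m, j * m:(j + 1) * m] = gammas[i + j]
    return H
```

The paper's point is precisely that `rows` and `cols` should not exceed a theoretical bound; making H larger than necessary costs computation without improving the estimates.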
Procedia PDF Downloads 243
18070 Iron Recovery from Red Mud As Zero-Valent Iron Metal Powder Using Direct Electrochemical Reduction Method
Authors: Franky Michael Hamonangan Siagian, Affan Maulana, Himawan Tri Bayu Murti Petrus, Panut Mulyono, Widi Astuti
Abstract:
In this study, the feasibility of the direct electrowinning method for producing zero-valent iron from red mud was investigated. The bauxite residue sample came from the Tayan mine, Indonesia, and contains a high proportion of hematite (Fe₂O₃). Before electrolysis, the samples were characterized by various analytical techniques (ICP-AES, SEM, XRD) to determine their chemical composition and mineralogy. The direct electrowinning of red mud suspended in NaOH was carried out at low temperatures ranging from 30 to 110 °C. Variations of current density, red mud:NaOH ratio and temperature were examined to determine the optimum operation of the direct electrowinning process. Cathode deposits and residues in the electrochemical cells were analyzed using XRD, XRF, and SEM to determine the chemical composition and current efficiency. The low-temperature electrolysis of red mud can reach 20% recovery at a current density of 920,945 A/m². The moderate performance of the process with red mud was attributed to the troublesome adsorption of red mud particles on the cathode, making the reduction far less efficient than that with pure hematite.
Keywords: alumina, red mud, electrochemical reduction, iron production
Procedia PDF Downloads 79
18069 Sequential Covering Algorithm for Nondifferentiable Global Optimization Problem and Applications
Authors: Mohamed Rahal, Djaouida Guetta
Abstract:
In this paper, the one-dimensional unconstrained global optimization problem for continuous functions satisfying a Hölder condition is considered. We extend the sequential covering algorithm (SCA) for Lipschitz functions to the larger class of Hölder functions. The convergence of the method is studied, and the algorithm can be applied to systems of nonlinear equations. Finally, some numerical examples are presented to illustrate the efficiency of the present approach.
Keywords: global optimization, Hölder functions, sequential covering method, systems of nonlinear equations
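For a function with |f(x) − f(y)| ≤ H·|x − y|^α, any evaluation grid of spacing h guarantees accuracy H·(h/2)^α, since every point of the interval lies within h/2 of a grid point. A naive uniform-covering sketch of this idea follows; the sequential algorithm in the paper refines the covering adaptively, which is not reproduced here:

```python
def holder_cover_minimize(f, a, b, H, alpha, eps):
    """Minimize f on [a, b] for an (H, alpha)-Hoelder function by evaluating
    on a grid fine enough that the worst-case error H * (h/2)**alpha <= eps.
    This brute-force covering is what sequential schemes improve upon."""
    h = 2.0 * (eps / H) ** (1.0 / alpha)      # spacing giving eps accuracy
    n = max(2, int((b - a) / h) + 2)          # enough points to cover [a, b]
    best_x, best_f = a, f(a)
    for i in range(1, n):
        x = min(b, a + i * (b - a) / (n - 1))
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f
```

Sequential covering replaces the uniform grid with Hölder lower bounds built from previous evaluations, discarding subintervals that provably cannot contain the minimum.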
Procedia PDF Downloads 370
18068 Comparison between Hardy-Cross Method and Water Software to Solve a Pipe Networking Design Problem for a Small Town
Authors: Ahmed Emad Ahmed, Zeyad Ahmed Hussein, Mohamed Salama Afifi, Ahmed Mohammed Eid
Abstract:
Water is of great importance in life. In order to deliver water from resources to users, many procedures must be undertaken by water engineers. One of the main procedures for delivering water to the community is designing pressurized pipe networks. The main aim of this work is to calculate the water demand of a small town and then design a simple water network to distribute water among the town with the smallest losses. The literature is reviewed to cover the main points related to water distribution. The methodology then introduces two approaches to solve the research problem: the iterative Hardy-Cross method and the water software Pipe Flow. The results present two designs that satisfy the same research requirements. Finally, the researchers conclude that the use of water software provides more capabilities and options for water engineers.
Keywords: looping pipe networks, Hardy-Cross method accuracy, relative error of the Hardy-Cross method
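The Hardy-Cross iteration referenced above corrects an assumed loop flow by ΔQ = −Σ K·Q·|Q|^{n−1} / (n·Σ K·|Q|^{n−1}); a single-loop sketch with n = 2 (head loss h = K·Q·|Q| per pipe, flows signed by loop direction, all values illustrative):

```python
def hardy_cross_single_loop(K, Q, n_iter=20):
    """Iteratively balance a single loop: with head loss h = K * Q * |Q|
    per pipe, each pass applies the same correction
    dQ = -sum(K * Q * |Q|) / (2 * sum(K * |Q|)) to every pipe in the loop,
    driving the signed head-loss sum around the loop to zero."""
    Q = list(Q)
    for _ in range(n_iter):
        num = sum(k * q * abs(q) for k, q in zip(K, Q))
        den = 2.0 * sum(k * abs(q) for k, q in zip(K, Q))
        dQ = -num / den
        Q = [q + dQ for q in Q]
    return Q
```

A real network repeats this per loop (with shared pipes receiving corrections from each loop they belong to) until all corrections are negligible, which is the iteration the software approach replaces.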
Procedia PDF Downloads 165
18067 Analysis and Re-Design Ergonomic Mineral Water Gallon Trolley
Authors: Dessy Laksyana Utami
Abstract:
Manual material handling activities often expose workers to muscle injury due to incorrect posture, so workers need tools that facilitate their activities. One tool to assist in the transportation of ordinary materials is a trolley. This tool is very useful because it can carry many items without requiring extra energy to operate, and trolleys are widely and comfortably used in the community. However, workers still complain about the old design because of its lack of grip and capacity. After postural analysis with the REBA method, a risk score was obtained showing that the tool needed to be improved. The redesign uses Indonesian anthropometric data at the 50th percentile.
Keywords: material handling, REBA method, postural assessment, trolley
Procedia PDF Downloads 137
18066 The Soliton Solution of the Quadratic-Cubic Nonlinear Schrodinger Equation
Authors: Sarun Phibanchon, Yuttakarn Rattanachai
Abstract:
The quadratic-cubic nonlinear Schrödinger equation describes weakly nonlinear ion-acoustic waves in a magnetized plasma with a slightly non-Maxwellian electron distribution, by means of Madelung's fluid picture. The soliton solution of the quadratic-cubic nonlinear Schrödinger equation is determined by direct integration. By the defining characteristics of a soliton, the solution can be confirmed as one by considering its time evolution and the collisions between two such solutions. These results are shown by applying the spectral method.
Keywords: soliton, ion-acoustic waves, plasma, spectral method
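The spectral method mentioned for propagating and colliding solutions is typically the split-step Fourier scheme: the nonlinear term is an exact pointwise phase rotation, the dispersive term an exact phase rotation in Fourier space. A sketch for i·u_t + ½·u_xx + (a|u| + b|u|²)·u = 0 (the coefficients, grid and normalization are illustrative assumptions, not the paper's):

```python
import numpy as np

def split_step_qcnls(u0, L, dt, steps, a=1.0, b=1.0):
    """Split-step Fourier propagation of the quadratic-cubic NLS
    i u_t + 0.5 u_xx + (a|u| + b|u|^2) u = 0 on a periodic box of length L.
    Each nonlinear half-step and the linear step are exact phase rotations,
    so the L2 norm of u is conserved (up to rounding)."""
    n = len(u0)
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=L / n)  # angular wavenumbers
    u = u0.astype(complex)
    for _ in range(steps):
        u *= np.exp(1j * dt / 2 * (a * np.abs(u) + b * np.abs(u) ** 2))
        u = np.fft.ifft(np.exp(-1j * dt * k ** 2 / 2) * np.fft.fft(u))
        u *= np.exp(1j * dt / 2 * (a * np.abs(u) + b * np.abs(u) ** 2))
    return u
```

Colliding two solitons amounts to choosing u0 as a sum of two well-separated, oppositely moving pulses and propagating long enough for them to pass through each other.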
Procedia PDF Downloads 317
18065 A Development of Personalized Edutainment Contents through Storytelling
Authors: Min Kyeong Cha, Ju Yeon Mun, Seong Baeg Kim
Abstract:
Recently, 'play in learning' has become important and is emphasized as a useful learning tool; therefore, interest in edutainment contents is growing. Storytelling is considered first as a method that improves the transmission of information and the learner's interest when planning edutainment contents. In this study, we designed edutainment contents in the form of an adventure game that applies the storytelling method. The contents provide dynamically constituted questions and items, and reorganize the learning contents through analysis of test results, allowing learners to solve various questions through effective iterative learning. As a result, the learners can reach mastery learning.
Keywords: storytelling, edutainment, mastery learning, computer operating principle
Procedia PDF Downloads 174
18064 Energy Communities from Municipality Level to Province Level: A Comparison Using Autoregressive Integrated Moving Average Model
Authors: Amro Issam Hamed Attia Ramadan, Marco Zappatore, Pasquale Balena, Antonella Longo
Abstract:
Considering the energy crisis that is hitting Europe, it becomes more and more necessary to change energy policies to depend less on fossil fuels and replace them with energy from renewable sources. This has triggered the urge to use clean energy, not only to satisfy energy needs and fulfill the required consumption but also to decrease the danger of climatic changes due to harmful emissions. Many countries have already started creating energy communities based on renewable energy sources. The first step to understanding the energy needs of any place is to know the consumption precisely. In this work, we estimate the electricity consumption of a municipality that forms part of a rural area located in southern Italy, using forecast models that allow the estimation of electricity consumption for the next ten years. We then apply the same model to the province where the municipality is located and estimate the future consumption for the same period, to examine whether it is possible to start from the municipality level and reach the province level when creating energy communities.
Keywords: ARIMA, electricity consumption, forecasting models, time series
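Full ARIMA fitting needs a statistics package, but the AR(1) special case of the family can be fit in closed form and iterated forward, which conveys the forecasting mechanics; the numbers below are invented, not the paper's consumption data:

```python
def fit_ar1(series):
    """Least-squares fit of x_t = c + phi * x_{t-1}, the AR(1) special
    case of the ARIMA family: regress each value on its predecessor."""
    x_prev, x_next = series[:-1], series[1:]
    n = len(x_prev)
    mx = sum(x_prev) / n
    my = sum(x_next) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x_prev, x_next))
    var = sum((a - mx) ** 2 for a in x_prev)
    phi = cov / var
    c = my - phi * mx
    return c, phi

def forecast_ar1(c, phi, last, steps):
    """Iterate the fitted recurrence to forecast `steps` periods ahead."""
    out = []
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out
```

An actual ARIMA(p, d, q) study like the one above would also difference the series d times and model moving-average terms, typically with a library fit by maximum likelihood.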
Procedia PDF Downloads 558
18063 Cuckoo Search (CS) Optimization Algorithm for Solving Constrained Optimization
Authors: Sait Ali Uymaz, Gülay Tezel
Abstract:
This paper presents comparison results on the performance of the Cuckoo Search (CS) algorithm for constrained optimization problems. For constraint handling, the CS algorithm uses the penalty method. The CS algorithm is tested on thirteen well-known test problems, and the results obtained are compared to the Particle Swarm Optimization (PSO) algorithm. Mean, best, median and worst values were employed for the analysis of performance.
Keywords: cuckoo search, particle swarm optimization, constrained optimization problems, penalty method
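The penalty handling mentioned above can be shown in isolation: infeasible candidates get a weighted constraint violation added to their objective, so any unconstrained search can be applied to the penalized function. A sketch on a toy problem (minimize x² subject to x ≥ 1), with plain random search standing in for the cuckoo/Lévy-flight moves of CS:

```python
import random

def penalized(f, constraints, x, rho=1e3):
    """Static penalty: objective plus rho times the sum of squared
    violations, where each constraint g is feasible when g(x) <= 0."""
    viol = sum(max(0.0, g(x)) ** 2 for g in constraints)
    return f(x) + rho * viol

def random_search(f, constraints, lo, hi, iters=5000, seed=0):
    """Minimize the penalized objective over [lo, hi] by random sampling;
    in CS the candidate points would come from Levy-flight steps and
    nest abandonment instead of uniform draws."""
    rng = random.Random(seed)
    best_x = lo
    best_v = penalized(f, constraints, best_x)
    for _ in range(iters):
        x = rng.uniform(lo, hi)
        v = penalized(f, constraints, x)
        if v < best_v:
            best_x, best_v = x, v
    return best_x
```

The penalty weight rho trades off feasibility against objective value; CS, PSO and other metaheuristics all consume the penalized function the same way.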
Procedia PDF Downloads 467
18062 Epileptic Seizure Prediction by Exploiting Signal Transitions Phenomena
Authors: Mohammad Zavid Parvez, Manoranjan Paul
Abstract:
A seizure prediction method is proposed that extracts global features, using phase correlation between adjacent epochs to detect relative changes, and local features, using fluctuation/deviation within an epoch to determine fine changes in different EEG signals. A classifier and a regularization technique are applied for the reduction of false alarms and improvement of the overall prediction accuracy. The experiments show that the proposed method outperforms the state-of-the-art methods and provides high prediction accuracy (i.e., 97.70%) with a low false alarm rate, using EEG signals from different brain locations in a benchmark data set.
Keywords: epilepsy, seizure, phase correlation, fluctuation, deviation
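Phase correlation between adjacent epochs, the global feature above, is the inverse FFT of the normalized cross-power spectrum; a sketch on one-dimensional signals standing in for EEG epochs (the feature extraction details of the paper are not reproduced):

```python
import numpy as np

def phase_correlation(a, b, eps=1e-12):
    """Phase-correlation surface of two equal-length signals: the inverse
    FFT of the unit-magnitude cross-power spectrum. A sharp peak away
    from index 0 indicates a relative (circular) shift between epochs."""
    A = np.fft.fft(a)
    B = np.fft.fft(b)
    cross = A * np.conj(B)
    cross /= np.abs(cross) + eps   # keep only the phase difference
    return np.real(np.fft.ifft(cross))
```

The location and sharpness of the peak summarize how much one epoch has shifted relative to the previous one, which is the kind of relative change the method exploits.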
Procedia PDF Downloads 144
18061 Comparison of Power Generation Status of Photovoltaic Systems under Different Weather Conditions
Authors: Zhaojun Wang, Zongdi Sun, Qinqin Cui, Xingwan Ren
Abstract:
Based on multivariate statistical analysis theory, this paper uses principal component analysis, Mahalanobis distance analysis and curve fitting to establish a photovoltaic health model for evaluating the health of photovoltaic panels. First, the photovoltaic panel variable data are classified by weather condition into five categories: sunny, cloudy, rainy, foggy and overcast, and the health of the panels is studied under each of these five weather types. Second, scatterplots of the relationship between the electricity generated under each kind of weather and the other variables were plotted. They show that the amount of electricity generated by the panels has a significant nonlinear relationship with time; fitting this relationship yields a nonlinear equation. Then principal component analysis was applied to the independent variables under the five weather conditions. According to the Kaiser-Meyer-Olkin test, three weather types (overcast, foggy and sunny) meet the conditions for factor analysis, while cloudy and rainy weather do not. For overcast weather the principal components are temperature, AQI and PM2.5; for foggy weather the principal component is temperature; and for sunny weather the principal components are temperature, AQI and PM2.5. Cloudy and rainy weather require analysis of all their variables, namely temperature, AQI, PM2.5, solar radiation intensity and time. Finally, taking the variable values in sunny weather as observed values and the principal components of cloudy, foggy, overcast and rainy weather as sample data, the Mahalanobis distances between the observed values and these sample values are obtained.
A comparative analysis of the deviation of these Mahalanobis distances determines the health of the photovoltaic panels under different weather conditions. Ordered from the smallest to the largest fluctuation of the Mahalanobis distance, the weather conditions are: foggy, cloudy, overcast and rainy.
Keywords: fitting, principal component analysis, Mahalanobis distance, SPSS, MATLAB
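The Mahalanobis-distance step described in this abstract can be sketched as follows. This is only a minimal illustration with synthetic data: the variable set (temperature, AQI, PM2.5) follows the abstract, but the sample values and array shapes are placeholders, not the study's measurements.

```python
import numpy as np

def mahalanobis(x, samples):
    """Mahalanobis distance of observation x from a sample distribution."""
    mu = samples.mean(axis=0)            # per-variable mean of the samples
    cov = np.cov(samples, rowvar=False)  # covariance matrix of the samples
    diff = x - mu
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

# Hypothetical data: rows are observations under one weather type, columns
# are the study's variables (e.g. temperature, AQI, PM2.5).
rng = np.random.default_rng(0)
overcast = rng.normal([20.0, 80.0, 35.0], [2.0, 10.0, 5.0], size=(50, 3))
sunny_obs = np.array([25.0, 60.0, 25.0])  # observed values in sunny weather
d = mahalanobis(sunny_obs, overcast)
print(round(d, 2))
```

Comparing such distances across weather types, as the authors do, indicates how far each condition's behaviour deviates from the sunny-weather baseline.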
Procedia PDF Downloads 144
18060 Improving the Design of Blood Pressure and Blood Saturation Monitors
Authors: L. Parisi
Abstract:
A blood pressure monitor, or sphygmomanometer, can be either manual or automatic, employing the auscultatory method or the oscillometric method, respectively. The manual version involves an inflatable cuff and a stethoscope used to detect the sounds generated by the arterial walls in order to measure blood pressure in an artery. An automatic sphygmomanometer monitors blood pressure through a pressure sensor, which detects the vibrations caused by oscillations of the arterial walls. The pressure sensor implemented in this device improves the accuracy of the measurements taken.
Keywords: blood pressure, blood saturation, sensors, actuators, design improvement
Procedia PDF Downloads 455
18059 Erodibility Analysis of Cikapundung Hulu: A Study Case of Mekarwangi Catchment Area
Authors: Shantosa Yudha Siswanto, Rachmat Harryanto
Abstract:
The aim of the research was to investigate the effect of land use and slope steepness on the soil erodibility index. The research was conducted from September to December 2013 in the Mekarwangi catchment area, a sub-watershed of Cikapundung Hulu, Indonesia, using a descriptive method. A physiographic free survey was used as the survey method, i.e. a survey based on the physiographic appearance of the land. Soil sampling was carried out along transects based on slope similarity, without measuring the distance between observation points. Soil samples were taken from three land-use classes (forest, plantation and dry cultivation area), each comprising three slope classes (8-15%, 16-25% and 26-40%). Five samples were taken from each combination, resulting in 45 observation points. The results showed that land-use type and slope class had different effects on soil erodibility. The highest C-organic content and permeability were found in forest on slopes of 16-25%, while forest land use on slopes of 8-15% gave the lowest soil erodibility.
Keywords: land use, slope, erodibility, erosion
Procedia PDF Downloads 251
18058 Phycoremediation of Heavy Metals by Marine Macroalgae Collected from Olaikuda, Rameswaram, Southeast Coast of India
Authors: Suparna Roy, Anatharaman Perumal
Abstract:
Industrial effluents with high amounts of heavy metals are known to have adverse effects on the environment. For the removal of heavy metals from the aqueous environment, various conventional treatment technologies have been applied, but these are not economically beneficial and also produce large quantities of toxic chemical sludge. Biosorption of heavy metals by marine plants is therefore an eco-friendly, innovative alternative technology for removing these pollutants from the aqueous environment. The aim of this study is to evaluate the capacity of selected marine macroalgae (seaweeds) to accumulate and remove heavy metals from the marine environment. Methods: The seaweeds Acanthophora spicifera (Vahl.) Boergesen, Codium tomentosum Stackhouse, Halimeda gracilis Harvey ex. J. Agardh, Gracilaria opuntia Durairatnam. nom. inval., Valoniopsis pachynema (Martens) Boergesen, Caulerpa racemosa var. macrophysa (Sonder ex Kutzing) W. R. Taylor and Hydroclathrus clathratus (C. Agardh) Howe were collected from Olaikuda (09°17.526'N-079°19.662'E), Rameshwaram, on the south-east coast of India, during the post-monsoon period (April 2016). The seaweeds were washed repeatedly with sterilized, filtered in-situ seawater to remove all epiphytes and debris, and the clean seaweeds were shade-dried for one week.
The dried seaweeds were ground to a powder, and one gram of powdered seaweed was placed in a 250 ml conical flask; 8 ml of 10% HNO3 (70% pure) was added to each sample, which was kept at room temperature (28 °C) for 24 hours. The samples were then heated on a hotplate at 120 °C and evaporated to dryness, 20 ml of nitric acid:perchloric acid (4:1) was added, and the samples were again heated on the hotplate at 90 °C and evaporated to dryness. After cooling at room temperature for a few minutes, 10 ml of 10% HNO3 was added, and the samples were kept for 24 hours in a cool, dark place and then filtered through Whatman (589/2) filter paper. The filtrates were collected in clean 250 ml conical flasks and diluted accurately to a volume of 25 ml with double-deionised water. Triplicates of each sample were analysed by inductively coupled plasma optical emission spectrometry (ICP-OES) for eleven heavy metals (Ag, Cd, B, Cu, Mn, Co, Ni, Cr, Pb, Zn and Al), and the data were statistically evaluated for standard deviation. Results: Acanthophora spicifera contained the highest amounts of Ag (0.1±0.2 mg/mg), Cu (0.16±0.01 mg/mg), Mn (1.86±0.02 mg/mg) and B (3.59±0.2 mg/mg); Halimeda gracilis showed the highest accumulation of Al (384.75±0.12 mg/mg); Valoniopsis pachynema accumulated the maximum amounts of Co (0.12±0.01 mg/mg) and Zn (0.64±0.02 mg/mg); Caulerpa racemosa var. macrophysa contained Zn (0.63±0.01 mg/mg), Cr (0.26±0.01 mg/mg), Ni (0.21±0.05 mg/mg), Pb (0.16±0.03 mg/mg) and Cd (0.02±0.00 mg/mg). Hydroclathrus clathratus, Codium tomentosum and Gracilaria opuntia also contained appreciable amounts of heavy metals. Conclusions: The above species of seaweeds play an important role in decreasing heavy metals pollution in the marine environment through bioaccumulation, so they can be utilised to remove excess heavy metals from polluted areas.
Keywords: heavy metals pollution, seaweeds, bioaccumulation, eco-friendly, phyco-remediation
Procedia PDF Downloads 235
18057 A Genetic Algorithm Based Ensemble Method with Pairwise Consensus Score on Malware Cacophonous Labels
Authors: Shih-Yu Wang, Shun-Wen Hsiao
Abstract:
In the field of cybersecurity, many vendors publish classification results for malware samples as names, called AV labels, that encode important information, and many researchers rely on these AV labels in their work. Unfortunately, AV labels are cluttered: they have no fixed format or naming rules, because each classifier names samples from its own viewpoint. One way to address the problem is to take a majority vote, but voting can introduce bias. We therefore propose a novel ensemble approach that does not rely on the cacophonous naming results but instead depends on group identification to aggregate every vendor's opinion. To achieve this, we develop a scoring system called the Pairwise Consensus Score (PCS) to measure the similarity of results. The overall architecture combines a genetic algorithm with the PCS to find the maximum consensus in the group. Experimental results revealed that our method outperformed majority voting by 10% in terms of this score.
Keywords: genetic algorithm, ensemble learning, malware family, malware labeling, AV labels
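The abstract does not give the PCS formula, so the following is only a hypothetical sketch of one way a pairwise consensus score could be defined: the average pairwise agreement between vendors' group assignments, which a genetic algorithm could then maximize over candidate groupings. The vendor names and label data are invented for illustration.

```python
from itertools import combinations

def pairwise_consensus(labelings):
    """Average pairwise agreement across vendors' group assignments.

    labelings: dict vendor -> list of group ids (one per sample).
    Returns a score in [0, 1]; 1.0 means all vendors agree on every sample.
    """
    vendors = list(labelings)
    n_samples = len(labelings[vendors[0]])
    pair_scores = []
    for a, b in combinations(vendors, 2):
        agree = sum(labelings[a][i] == labelings[b][i] for i in range(n_samples))
        pair_scores.append(agree / n_samples)
    return sum(pair_scores) / len(pair_scores)

# Hypothetical AV labels already mapped to family group ids per sample.
votes = {
    "vendorA": [0, 0, 1, 1],
    "vendorB": [0, 0, 1, 0],
    "vendorC": [0, 1, 1, 1],
}
print(pairwise_consensus(votes))  # mean agreement over the three vendor pairs
```

Unlike a straight majority vote, a score of this kind evaluates how consistent a whole grouping is, which is the property the authors' group-identification approach exploits.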
Procedia PDF Downloads 86
18056 Development and Validation of a Green Analytical Method for the Analysis of Daptomycin Injectable by Fourier-Transform Infrared Spectroscopy (FTIR)
Authors: Eliane G. Tótoli, Hérida Regina N. Salgado
Abstract:
Daptomycin is an important antimicrobial agent in current clinical practice, since it is very active against some Gram-positive bacteria that are particularly challenging for medicine, such as methicillin-resistant Staphylococcus aureus (MRSA) and vancomycin-resistant Enterococci (VRE). Environmental preservation has received special attention in recent years. Considering the evident need to protect the natural environment and the introduction of strict quality requirements for the analytical procedures used in pharmaceutical analysis, industry must seek environmentally friendly alternatives for its routine analytical methods and other processes. In view of these factors, green analytical chemistry is prevalent and encouraged nowadays, and infrared spectroscopy stands out in this context: it uses no organic solvents and, although formally accepted for the identification of individual compounds, it also allows the quantitation of substances. Considering that few green analytical methods for the analysis of daptomycin are described in the literature, the aim of this work was the development and validation of a green analytical method for the quantification of this drug in lyophilized powder for injectable solution by Fourier-transform infrared spectroscopy (FT-IR). Method: Translucent potassium bromide pellets containing predetermined amounts of the drug were prepared and subjected to spectrophotometric analysis in the mid-infrared region. After the infrared spectrum was obtained, quantitative analysis was carried out with the assistance of the IR Solution software in the spectral region between 1575 and 1700 cm-1, corresponding to a carbonyl band of the daptomycin molecule, whose height was measured in absorbance.
The method was validated according to ICH guidelines with respect to linearity, precision (repeatability and intermediate precision), accuracy and robustness. Results and discussion: The method proved linear (r = 0.9999), precise (RSD% < 2.0), accurate and robust over a concentration range from 0.2 to 0.6 mg/pellet. In addition, the technique uses no organic solvents, a great advantage over the most common analytical methods: it helps minimize the generation of organic solvent waste by industry and thereby reduces the environmental impact of its activities. Conclusion: The validated method proved adequate for quantifying daptomycin in lyophilized powder for injectable solution and can be used for its routine analysis in quality control. Moreover, the proposed method is environmentally friendly, in line with the global trend.
Keywords: daptomycin, Fourier-transform infrared spectroscopy, green analytical chemistry, quality control, spectrometry in IR region
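The calibration behind such a quantitative IR method amounts to a least-squares line relating band height (absorbance) to drug amount per pellet. Below is a sketch of that step; the concentration and absorbance numbers are illustrative assumptions, not the study's data.

```python
import numpy as np

# Hypothetical calibration data: drug amount (mg/pellet) vs. height of the
# carbonyl band (absorbance) in the 1575-1700 cm-1 region.
conc = np.array([0.2, 0.3, 0.4, 0.5, 0.6])
absorbance = np.array([0.11, 0.16, 0.21, 0.26, 0.31])

# Least-squares straight line: A = slope * c + intercept
slope, intercept = np.polyfit(conc, absorbance, 1)
r = np.corrcoef(conc, absorbance)[0, 1]  # correlation coefficient

# Predict the amount in an unknown sample from its measured band height.
unknown_abs = 0.235
pred = (unknown_abs - intercept) / slope
print(round(slope, 3), round(r, 4), round(pred, 3))
```

A correlation coefficient near 1 over the working range (here the abstract reports r = 0.9999 for 0.2-0.6 mg/pellet) is what justifies using the line for routine quantitation.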
Procedia PDF Downloads 381
18055 Nonlinear Dynamic Analysis of Base-Isolated Structures Using a Mixed Integration Method: Stability Aspects and Computational Efficiency
Authors: Nicolò Vaiana, Filip C. Filippou, Giorgio Serino
Abstract:
In order to reduce the numerical computations in the nonlinear dynamic analysis of seismically base-isolated structures, a Mixed Explicit-Implicit time integration Method (MEIM) has been proposed. The MEIM adopts the explicit, conditionally stable central difference method to compute the nonlinear response of the base isolation system and the implicit, unconditionally stable Newmark constant average acceleration method to determine the linear response of the superstructure. Although the scheme is conditionally stable because of the central difference method, it avoids the iterative procedure generally required, within each time step of the analysis, by conventional monolithic solution approaches. The main aim of this paper is to investigate the stability and computational efficiency of the MEIM when it is employed to perform the nonlinear time history analysis of base-isolated structures with sliding bearings; in this case, the critical time step could become smaller than the one needed to define the earthquake excitation accurately, owing to the very high initial stiffness of such devices. The numerical results of nonlinear dynamic analyses of a base-isolated structure with a friction pendulum bearing system, performed using the proposed MEIM, are compared to those obtained with a conventional monolithic solution approach, i.e. the implicit, unconditionally stable Newmark constant average acceleration method employed in conjunction with the iterative pseudo-force procedure. According to the numerical results, in the presented application the MEIM exhibits no stability problems, the critical time step being larger than the ground acceleration sampling step despite the high initial stiffness of the friction pendulum bearings.
In addition, compared to the conventional monolithic solution approach, the proposed algorithm preserves its computational efficiency even when the nonlinear dynamic analysis is performed with a smaller time step.
Keywords: base isolation, computational efficiency, mixed explicit-implicit method, partitioned solution approach, stability
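The explicit half of such a partitioned scheme, the central difference method, can be sketched for a single-degree-of-freedom system as below. The mass, stiffness and loading values are illustrative only; the actual MEIM couples a step like this for the isolation system with an implicit Newmark solve for the superstructure.

```python
import numpy as np

def central_difference(m, c, k, f, u0, v0, dt, n_steps):
    """Explicit central-difference integration of m*u'' + c*u' + k*u = f(t).

    Conditionally stable: dt must stay below 2/omega_n = 2*sqrt(m/k).
    """
    u = np.zeros(n_steps + 1)
    u[0] = u0
    a0 = (f(0.0) - c * v0 - k * u0) / m        # initial acceleration
    u_prev = u0 - dt * v0 + 0.5 * dt**2 * a0   # fictitious step u_{-1}
    m_eff = m / dt**2 + c / (2 * dt)           # effective "mass" term
    for i in range(n_steps):
        rhs = (f(i * dt) - (k - 2 * m / dt**2) * u[i]
               - (m / dt**2 - c / (2 * dt)) * (u[i - 1] if i > 0 else u_prev))
        u[i + 1] = rhs / m_eff
    return u

# Illustrative undamped free vibration: omega = 2*pi rad/s, period T = 1 s.
m, c, k = 1.0, 0.0, 4.0 * np.pi**2
u = central_difference(m, c, k, lambda t: 0.0, u0=1.0, v0=0.0,
                       dt=0.01, n_steps=100)
print(round(u[-1], 2))  # close to 1.0 again after one full period
```

The conditional stability the abstract discusses is visible here: with dt above 2/omega_n the recursion diverges, which is why very stiff sliding bearings can force a small critical time step.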
Procedia PDF Downloads 278
18054 Sustainable Traditional Architecture and Urban Planning in Hot-Arid Climate of Iran
Authors: Farnaz Nazem
Abstract:
The aim of sustainable architecture is to design buildings with the least adverse effects on the environment and to provide better conditions for people. What building forms make the best use of land? This question was addressed in the late 1960s at the Centre for Land Use and Built Form Studies in Cambridge, leading to a number of papers that greatly influenced the practice of urban design. This paper concentrates on the sustainability achieved through climatic adaptation in Iranian traditional architecture in hot-arid regions. Since people spent a significant amount of their time in houses, it was very important that houses fulfil their physical and spiritual needs and satisfy the cultural and religious aspects of their lifestyles. In a vast country such as Iran, with its different climatic zones, traditional builders have developed a series of logical solutions for human comfort, and these solutions have been able to respond to environmental problems over a long period of time. By drawing on this experience of traditional architecture in the hot-arid climate of Iran, it is possible to attain sustainable architecture.
Keywords: hot-arid climate, Iran, sustainable traditional architecture, urban planning
Procedia PDF Downloads 472
18053 Characteristics of Clayey Subgrade Soil Mixed with Cement Stabilizer
Authors: Manju, Praveen Aggarwal
Abstract:
From a civil engineering point of view, clayey soil is considered the weakest subgrade soil under moist conditions: these swelling soils attract and absorb water and lose their strength. Certain inherent properties of clayey soils need modification before their bulk use in the construction of highway/runway pavements, embankments, etc. In this paper, results for a clayey subgrade modified with a cement stabilizer are presented. The investigation includes evaluation of the specific gravity, Atterberg limits, grain size distribution, maximum dry density, optimum moisture content and CBR value of the clayey soil and of the cement-treated clayey soil. A series of Proctor compaction and CBR tests (un-soaked and soaked) was carried out on the clayey soil and on the clayey soil mixed with cement stabilizer at 2%, 4% and 6% of the dry weight of the soil. In the soaked CBR test, the best results are obtained with 6% cement; however, the difference between the CBR values at 4% and 6% cement is small, so from an economic standpoint the addition of 4% cement gives the best result after a soaking period of 90 days.
Keywords: clayey soil, cement, maximum dry density, optimum moisture content, California bearing ratio
Procedia PDF Downloads 340