Search results for: maximum power point tracking
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 14474

8624 Seismic Performance Point of RC Frame Buildings Using ATC-40, FEMA 356 and FEMA 440 Guidelines

Authors: Gram Y. Rivas Sanchez

Abstract:

The seismic design codes in the world allow the analysis of structures assuming elastic-linear behavior; under earthquakes, however, structures exhibit non-linear behavior that induces damage in their elements. For this reason, non-linear methods are needed to analyze these structures. Non-linear dynamic methods provide the most reliable results but are computationally expensive; non-linear static methods do not have this disadvantage and are being used more and more. In the present work, nonlinear static (pushover) analyses of RC frame buildings of three, five, and seven stories are carried out using concentrated-plasticity models with plastic hinges, and the seismic performance points are determined using the ATC-40, FEMA 356, and FEMA 440 guidelines. The FEMA 440 guideline yields the largest inelastic displacements and base shears, producing the most conservative designs.
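As a rough illustration of how a performance point can be estimated, the sketch below applies the FEMA 356 displacement coefficient method, where the target displacement is δt = C0·C1·C2·C3·Sa·Te²/(4π²)·g. The coefficient and spectral values are illustrative assumptions, not values from the paper.

```python
import math

def target_displacement(C0, C1, C2, C3, Sa_g, Te, g=9.81):
    """FEMA 356 coefficient method:
    delta_t = C0 * C1 * C2 * C3 * Sa * (Te^2 / (4*pi^2)) * g,
    where Sa_g is spectral acceleration in units of g and Te is the
    effective fundamental period in seconds."""
    return C0 * C1 * C2 * C3 * Sa_g * g * Te**2 / (4 * math.pi**2)

# Hypothetical values (not from the paper): a low-rise frame with
# Te = 0.5 s, Sa = 0.5 g, and all modification coefficients taken as 1.0.
delta_t = target_displacement(1.0, 1.0, 1.0, 1.0, 0.5, 0.5)
```

The resulting target displacement (about 31 mm here) is the roof displacement at which demand and capacity are compared on the pushover curve.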

Keywords: pushover, nonlinear, RC building, FEMA 440, ATC 40

Procedia PDF Downloads 143
8623 Comparative Chromatographic Profiling of Wild and Cultivated Macrocybe Gigantea (Massee) Pegler & Lodge

Authors: Gagan Brar, Munruchi Kaur

Abstract:

Macrocybe gigantea was collected from the wild, where it grows as pure white, fleshy, robust fruit bodies in caespitose clusters. Its edibility was first indicated by local women who collect the fruiting bodies for cooking, and was later confirmed through classical and molecular taxonomy. A culture of this potentially valuable wild edible taxon was raised with the aim of domesticating it. Various solid and liquid media were evaluated for vegetative growth: Malt Extract Agar was found to be the best solid medium and Glucose Peptone the best liquid medium. The effects of temperature and pH on the vegetative growth of M. gigantea were also evaluated, with maximum growth at 30 °C and pH 5. For spawn preparation, various grains (wheat, jowar, bajra, and maize) were evaluated, and wheat grains boiled for 30 minutes gave the maximum mycelial growth; the mother spawn was therefore prepared on wheat grains boiled for 30 minutes. For raising fruiting bodies, different locally available agro-wastes were tried, and paddy straw gave the best growth. Wild and cultivated M. gigantea were then compared through HPLC to evaluate their nutritional and nutraceutical values. Of the 15 sugars analyzed, melezitose, trehalose, glucose, xylose, and mannitol were found in the wild collection, while melezitose, trehalose, xylose, and dulcitol were detected in the cultivated sample. Of the 20 amino acids examined, 18 were found in both the wild and cultivated samples, asparagine and glutamine being absent.
Of the 37 tested fatty acids, only six (palmitic acid, stearic acid, cis-9 oleic acid, linoleic acid, gamma-linolenic acid, and tricosanoic acid) were found in both the wild and cultivated samples, although their concentrations were higher in the cultivated sample. Of the vitamins tested, vitamins C, D, and E were present in both samples. Both samples were also evaluated for phenols against eleven standards in the HPLC analysis: gallic acid, resorcinol, ferulic acid, and pyrogallol were present in the wild mushroom sample, whereas ferulic acid, caffeic acid, vanillic acid, and vanillin were present in the cultivated one. Flavonoid analysis revealed the presence of rutin, naringin, and quercetin in wild M. gigantea, and of five flavonoids (naringin, catechol, myricetin, gossypin, and quercetin) in the cultivated one. The comparative chromatographic profiling of wild and cultivated M. gigantea shows that no nutrients were lost during cultivation, while the percentage of secondary metabolites (phenols and flavonoids) increased in the cultivated sample relative to the wild one. Cultivated M. gigantea can therefore be recommended for commercial use as a good food supplement.

Keywords: culture, edible, fruit bodies, wild

Procedia PDF Downloads 63
8622 Monitoring Future Climate Change Patterns over Major Cities in Ghana Using Coupled Model Intercomparison Project Phase 5, Support Vector Machine, and Random Forest Modeling

Authors: Stephen Dankwa, Zheng Wenfeng, Xiaolu Li

Abstract:

Climate change has recently been gaining the attention of countries across the world. Climate change, also known as global warming, refers to the increase in average surface temperature, and it has been a concern of the Environmental Protection Agency of Ghana. Ghana has become vulnerable to the effects of climate change because the majority of its population depends on agriculture; the clearing of trees to grow crops and the burning of charcoal contribute to the rise in temperature by releasing carbon dioxide and other greenhouse gases into the air. Petroleum stations across the cities have recently caught fire as a result of such climate events, leaving Ghana poorly positioned to withstand them. The significance of this research is therefore to project what the rise in average surface temperature will look like at the end of the mid-21st century if agriculture and deforestation are allowed to continue in the country. This study uses output data from the Coupled Model Intercomparison Project phase 5 (CMIP5) RCP 8.5 experiment to monitor future climate change from 2041-2050, at the end of the mid-21st century, over ten major cities in Ghana (Accra, Bolgatanga, Cape Coast, Koforidua, Kumasi, Sekondi-Takoradi, Sunyani, Ho, Tamale, Wa). Support Vector Machine and Random Forest models, with each city described by heat-wave metrics (minimum temperature, maximum temperature, mean temperature, heat-wave duration, and number of heat waves), provided more than 50% accuracy in predicting and monitoring the pattern of surface air temperature. The findings indicate that the near-surface air temperature will rise by 1 °C-2 °C over the coastal cities (Accra, Cape Coast, Sekondi-Takoradi).
The temperature over Kumasi, Ho, and Sunyani will rise by 1 °C by the end of 2050; in Koforidua, it will rise by 1 °C-2 °C, and in Bolgatanga, Tamale, and Wa by 0.5 °C. This indicates that the coastal and southern parts of the country are warming faster than the north, even though the northern part remains the hottest. During heat waves from 2041-2050, Bolgatanga, Tamale, and Wa will experience the highest mean daily air temperatures, between 34 °C and 36 °C; Kumasi, Koforidua, and Sunyani will experience about 34 °C; and the coastal cities (Accra, Cape Coast, Sekondi-Takoradi) below 32 °C. Although the coastal cities will experience the lowest mean temperatures, they will have the highest number of heat waves, about 62. The majority of heat waves will last 2 to 10 days, with a maximum of 30 days. The surface temperature will continue to rise over the major cities in Ghana through the end of the mid-21st century (2041-2050), and these findings should be addressed to the Environmental Protection Agency of Ghana in order to mitigate the problem.
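A minimal sketch of the kind of modeling described above, assuming scikit-learn and synthetic stand-ins for the heat-wave metrics (the real CMIP5-derived features and targets are not reproduced here):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic stand-ins for the paper's heat-wave metrics (assumed feature set):
# min temp, max temp, mean temp, heat-wave duration, number of heat waves.
X = rng.uniform([20, 28, 24, 2, 10], [28, 38, 32, 30, 62], size=(200, 5))
# Toy target: near-surface air temperature driven mostly by the mean temp.
y = X[:, 2] + 0.05 * X[:, 3] + rng.normal(0, 0.3, size=200)

X_train, X_test = X[:150], X[150:]
y_train, y_test = y[:150], y[150:]

rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)
svm = SVR(C=10.0).fit(X_train, y_train)

rf_pred = rf.predict(X_test)
svm_pred = svm.predict(X_test)
```

With real reanalysis-derived features, the same fit/predict pattern applies; feature scaling would typically be added for the SVM.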

Keywords: climate changes, CMIP5, Ghana, heat waves, random forest, SVM

Procedia PDF Downloads 196
8621 Combining Experiments and Surveys to Understand the Pinterest User Experience

Authors: Jolie M. Martin

Abstract:

Running experiments while logging detailed user actions has become the standard way of testing product features at Pinterest, as at many other Internet companies. While this technique offers plenty of statistical power to assess the effects of product changes on behavioral metrics, it does not often give us much insight into why users respond the way they do. By combining at-scale experiments with smaller surveys of users in each experimental condition, we have developed a unique approach for measuring the impact of our product and communication treatments on user sentiment, attitudes, and comprehension.
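One standard way to compare per-condition survey responses of this kind is a two-proportion z-test on, say, the share of positive-sentiment respondents; the sketch below uses made-up counts and is not taken from Pinterest's methodology.

```python
import math

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """z-statistic for the difference in proportions between two
    experimental conditions, using the pooled standard error."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical survey results: positive-sentiment respondents in
# control vs. treatment (numbers invented for illustration).
z = two_proportion_ztest(180, 400, 220, 400)
```

A |z| above roughly 1.96 would indicate a difference significant at the 5% level for a two-sided test.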

Keywords: experiments, methodology, surveys, user experience

Procedia PDF Downloads 309
8620 Thermodynamics of the Local Hadley Circulation Over Central Africa

Authors: Landry Tchambou Tchouongsi, Appolinaire Derbetini Vondou

Abstract:

This study describes the local Hadley circulation (HC) during the December-February (DJF) and June-August (JJA) seasons over Central Africa (CA), both from the divergent component of the mean meridional wind and from a new method based on the variation of the ψ vector. ERA5 reanalysis data for the period 1983 to 2013 were used. The results show that the maximum of the upward branch of the local Hadley circulation in both the DJF and JJA seasons is located over the Congo Basin (CB). Seasonal and horizontal variations in the mean temperature gradient and in thermodynamic properties are largely associated with the distribution of convection and large-scale upward motion; temperatures over the CB thus show only a slight variation between the DJF and JJA seasons. Moreover, the transport of moist static energy (MSE) adequately captures the mean flow component of the HC over the tropics, and the divergence over the CB is enhanced by the low pressure of western Cameroon and by the contribution of warm, dry air currents from the Sahara.

Keywords: circulation, reanalysis, thermodynamic, local Hadley

Procedia PDF Downloads 85
8619 Experimental Research of Biogas Production by Using Sewage Sludge and Chicken Manure Bioloadings with Wood Biochar Additive

Authors: P. Baltrenas, D. Paliulis, V. Kolodynskij, D. Urbanas

Abstract:

A bioreactor is a device used for biogas production from various organic materials under anaerobic conditions. In this research, a batch bioreactor with a mechanical mixer was used for biogas production from sewage sludge and chicken manure bioloadings. The anaerobic digestion process was mesophilic (35 °C). The produced biogas was stored in a gasholder, and the concentrations of its components were measured with an INCA 4000 biogas analyser. A specific additive (pine wood biochar) was also applied in preparing the bioloadings. The application of wood biochar in the bioloadings increases the CH₄ concentration in the produced gas by 6-7%. The highest CH₄ concentrations were found in biogas produced during the decomposition of sewage sludge bioloadings, reaching a maximum of 77.4%. The studies also showed that the application of biochar in the bioloadings reduces the average CO₂ and H₂S concentrations in the biogas.

Keywords: biochar, biogas, bioreactor, sewage sludge

Procedia PDF Downloads 161
8618 The United States Film Industry and Its Impact on Latin American Identity Rationalizations

Authors: Alfonso J. García Osuna

Abstract:

Background and Significance: The objective of this paper is to analyze the inception and development of identity archetypes in early XX century Latin America, to explore their roots in United States culture, to discuss the influences that came to bear upon Latin Americans as the United States began to export images of standard identity paradigms through its film industry, and to survey how these images evolved and impacted Latin Americans' ideas of national distinctiveness from the early 1900s to the present. The general hypothesis of this work is therefore that United States film in many ways influenced national identity patterning in its neighbors, especially in those nations closest to its borders, Cuba and Mexico. Very little research has been done on the social impact of the United States film industry on the country's southern neighbors. From a historical perspective, the US's influence has been examined as the projection of political and economic power; that is to say, American influence is seen as a catalyst to align the forces that the US wants to see wield the power of the State. But the subtle yet powerful cultural influence exercised by film, the eminent medium for exporting ideas and ideals in the XX century, has not been significantly explored. Basic Methodologies and Description: Gramscian Marxist theory underpins the study, where it is argued that film, as an exceptional vehicle for culture, is an important site of political and social struggle; in this context, the study aims to show how United States capitalist structures of power not only use brute force to generate and maintain control of overseas markets, but also promote their ideas through artistic products such as film in order to infiltrate the popular culture of subordinated peoples. In the same vein, the work of neo-Marxist theoreticians of popular culture is employed in order to contextualize the agency of subordinated peoples in the process of cultural assimilation.
Indication of the Major Findings of the Study: The study has yielded much data of interest. The salient finding is that each nation receives United States film according to its own particular social and political context, regardless of the amount of pressure exerted upon it. An example of this is the unmistakable dissimilarity between Cuban and Mexican reception of US films. The positive reception given in Cuba to American film has to do with the seamless acceptance of identity paradigms that, for historical reasons discussed herein, were incorporated into the national identity grid quite unproblematically. Such is not the case with Mexico, whose express rejection of identity paradigms offered by the United States reflects not only past conflicts with the northern neighbor but also an enduring recognition of the country's indigenous roots, one that precluded such paradigms. Concluding Statement: This paper is an endeavor to elucidate the ways in which US film contributed to the outlining of Latin American identity blueprints, offering archetypes that would be accepted or rejected according to each nation's particular social requirements, constraints, and ethnic makeup.

Keywords: film studies, United States, Latin America, identity studies

Procedia PDF Downloads 291
8617 New Security Approach of Confidential Resources in Hybrid Clouds

Authors: Haythem Yahyaoui, Samir Moalla, Mounir Bouden, Skander Ghorbel

Abstract:

Nowadays, cloud environments are becoming a necessity for companies. This technology offers the opportunity to access data anywhere and at any time, provides optimized and secured access to resources, and gives greater security to the data stored on the platform. However, some companies do not trust cloud providers: in their view, providers can access and modify confidential data such as bank accounts. Much work has been done in this context, concluding that encryption performed by the provider ensures confidentiality, but overlooking the fact that the provider itself can decrypt the confidential resources. The better solution is to apply some transformation to the data before sending it to the cloud, with the objective of making it unreadable to the provider. This work aims at enhancing the quality of service of providers and improving the trust of customers.

Keywords: cloud, confidentiality, cryptography, security issues, trust issues

Procedia PDF Downloads 371
8616 Orthogonal Regression for Nonparametric Estimation of Errors-In-Variables Models

Authors: Anastasiia Yu. Timofeeva

Abstract:

Two new algorithms for nonparametric estimation of errors-in-variables models are proposed. The first algorithm is based on a penalized regression spline: the spline is represented as a piecewise-linear function, and orthogonal regression is estimated for each linear portion; this algorithm is iterative. The second algorithm involves locally weighted regression estimation; when the independent variable is measured with error, such estimation is a complex nonlinear optimization problem. The simulation results have shown the advantage of the second algorithm under the assumption that the true values of the smoothing parameters are known. Nevertheless, using goodness-of-fit indexes to select the smoothing parameters gives similar results, with an oversmoothing effect.
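Orthogonal regression for a single linear portion can be sketched as a total least squares fit via the SVD; this is the standard construction (minimize perpendicular distances), not necessarily the authors' exact implementation.

```python
import numpy as np

def orthogonal_fit(x, y):
    """Orthogonal (total least squares) line fit: minimizes perpendicular
    distances, appropriate when x is measured with error."""
    xc, yc = x - x.mean(), y - y.mean()
    M = np.column_stack([xc, yc])
    # The right singular vector for the smallest singular value is the
    # normal to the fitted line.
    _, _, vt = np.linalg.svd(M, full_matrices=False)
    nx, ny = vt[-1]
    slope = -nx / ny
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

# Noiseless check: points on y = 2x + 1 are recovered (up to rounding).
x = np.linspace(0.0, 10.0, 50)
slope, intercept = orthogonal_fit(x, 2.0 * x + 1.0)
```

Unlike ordinary least squares, this fit is symmetric in x and y, which is why it suits the errors-in-variables setting.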

Keywords: grade point average, orthogonal regression, penalized regression spline, locally weighted regression

Procedia PDF Downloads 406
8615 On Dialogue Systems Based on Deep Learning

Authors: Yifan Fan, Xudong Luo, Pingping Lin

Abstract:

Nowadays, dialogue systems are increasingly becoming the way for humans to access many computer systems, allowing humans to interact with computers in natural language. A dialogue system consists of three parts: understanding what humans say in natural language, managing the dialogue, and generating responses in natural language. In this paper, we survey deep learning based methods for dialogue management, response generation, and dialogue evaluation. Specifically, these methods are based on neural networks, long short-term memory networks, deep reinforcement learning, pre-training, and generative adversarial networks. We compare these methods and point out further research directions.

Keywords: dialogue management, response generation, deep learning, evaluation

Procedia PDF Downloads 158
8614 The Columbine Shooting in German Media Coverage: A Point of No Return

Authors: Melanie Verhovnik

Abstract:

School shootings are a well-known phenomenon in Germany, where 14 have occurred to date. The first German case happened half a year after the Columbine shooting of April 20th, 1999, in the United States, which was at the time the most serious school shooting to have occurred anywhere in the world. The German media gave only scant attention to school shootings before Columbine, even though there had been numerous instances throughout the world, including several serious ones in the United States during the 1990s. A mixed-method design of qualitative and quantitative content analysis was employed to demonstrate the main features and characteristics of the core German media's coverage of Columbine.

Keywords: Columbine, media coverage, qualitative, quantitative content analysis, school shooting

Procedia PDF Downloads 304
8613 Enhancing the Performance of Automatic Logistic Centers by Optimizing the Assignment of Material Flows to Workstations and Flow Racks

Authors: Sharon Hovav, Ilya Levner, Oren Nahum, Istvan Szabo

Abstract:

In modern large-scale logistic centers (e.g., big automated warehouses), complex logistic operations performed by human staff (pickers) need to be coordinated with the operations of automated facilities (robots, conveyors, cranes, lifts, flow racks, etc.). The efficiency of advanced logistic centers strongly depends on optimizing picking technologies in sync with the facility/product layout, as well as on optimal distribution of material flows (products) in the system. The challenge is to develop a mathematical operations research (OR) tool that will optimize system cost-effectiveness. In this work, we propose a model that describes an automatic logistic center consisting of a set of workstations located in several galleries (floors), with each station containing a known number of flow racks. The requirements of each product and the working capacity of the stations served by a given set of workers (pickers) are assumed to be predetermined. The goal of the model is to maximize system efficiency. The proposed model includes two echelons. The first sets the optimal number of workstations needed to create the total processing/logistic system, subject to picker capacities. The second echelon deals with the assignment of products to workstations and flow racks, aiming to achieve maximal throughput of picked products over the entire system given picker capacities and budget constraints. The solutions to the problems at the two echelons interact to balance the overall load in the flow racks and maximize overall efficiency. We have developed an operations research model for each echelon. In the first echelon, the problem of calculating the optimal number of workstations is formulated as a non-standard bin-packing problem with capacity constraints for each bin.
The problem arising in the second echelon is presented as a constrained product-workstation-flow rack assignment problem with a non-standard min-max criterion, in which the workload maximum is taken across all workstations in the center and the exterior minimum across all possible product-workstation-flow rack assignments. The OR problems arising in each echelon are proved to be NP-hard. Consequently, we develop heuristic and approximation solution algorithms based on exploiting and improving local optima. The LC model considered in this work is highly dynamic and is recalculated periodically based on updated demand forecasts that reflect market trends, technological changes, seasonality, and the introduction of new items. The suggested two-echelon approach and the min-max balancing scheme are shown to work effectively on illustrative examples and real-life logistic data.
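Since the first-echelon problem is a capacity-constrained bin packing, a classic first-fit-decreasing heuristic gives a feel for the workstation-sizing step; the loads and capacity below are invented, and this generic heuristic is not the authors' specific algorithm.

```python
def first_fit_decreasing(loads, capacity):
    """First-fit-decreasing heuristic for bin packing: sort demands in
    decreasing order and place each into the first bin (workstation)
    that still has room, opening a new bin when none fits."""
    bins = []
    for load in sorted(loads, reverse=True):
        for b in bins:
            if sum(b) + load <= capacity:
                b.append(load)
                break
        else:
            bins.append([load])
    return bins

# Illustrative product processing demands against a picker capacity of 7.
stations = first_fit_decreasing([5, 4, 3, 2, 2], capacity=7)
```

FFD uses at most roughly 11/9 of the optimal number of bins plus a constant, which is why such heuristics are acceptable for an NP-hard sizing step.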

Keywords: logistics center, product-workstation, assignment, maximum performance, load balancing, fast algorithm

Procedia PDF Downloads 222
8612 Lineup Optimization Model of Basketball Players Based on the Prediction of Recurrent Neural Networks

Authors: Wang Yichen, Haruka Yamashita

Abstract:

In recent years, decision making in sports, such as in-game member selection and game strategy based on the analysis of accumulated sports data, has been widely attempted. In the NBA, the basketball league where the world's highest-level players gather, teams analyze data using various statistical techniques in order to win games. However, it is difficult to analyze per-play game data such as ball tracking or player motion, because the situation of the game changes rapidly and the structure of the data is complicated; an analysis method for real-time game play data is therefore needed. In this research, we propose an analytical model for determining the optimal lineup composition using real-time play data, a task that is difficult for any coach. Because replacing the entire lineup is too complicated, the practical questions are whether the lineup should be changed and whether a Small Ball lineup should be adopted; we therefore propose an analytical model for the optimal player selection problem based on Small Ball lineups. In basketball, scoring data, which indicate a player's contribution to the game, can be accumulated for each play and treated as a time series. To compare the importance of players in different situations and lineups, we combine an RNN (Recurrent Neural Network) model, which can analyze time-series data, with an NN (Neural Network) model, which can analyze the situation on the court, to build a score prediction model; this model can identify the current optimal lineup for different situations. We collected NBA play data for the 2019-2020 season and applied the method to actual basketball play data to verify the reliability of the proposed model.
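A toy forward pass illustrating the proposed RNN-plus-NN combination: an Elman-style recurrence summarizes the per-play time series, and its final state is merged with static situation features. All shapes, weights, and features here are invented for illustration; the paper's actual architecture and training are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict_score(play_seq, situation, params):
    """Sketch of the combined model: an RNN summarizes the per-play scoring
    time series, then a linear output layer combines the final hidden state
    with static 'situation' features (e.g. lineup indicators)."""
    Wx, Wh, Wo = params
    h = np.zeros(Wh.shape[0])
    for x_t in play_seq:                         # Elman-style recurrence
        h = np.tanh(Wx @ x_t + Wh @ h)
    z = np.concatenate([h, situation])           # merge RNN state with NN input
    return float(Wo @ z)

hidden, n_feat, n_sit = 8, 3, 5
params = (rng.normal(size=(hidden, n_feat)),     # input-to-hidden weights
          rng.normal(size=(hidden, hidden)) * 0.1,  # recurrent weights
          rng.normal(size=(hidden + n_sit,)))    # output layer weights

seq = rng.normal(size=(12, n_feat))              # 12 plays, 3 metrics per play
situation = np.array([1.0, 0.0, 1.0, 0.0, 1.0])  # e.g. a Small Ball indicator
score_hat = predict_score(seq, situation, params)
```

In practice the weights would be trained end-to-end on the accumulated play data, and candidate lineups compared by their predicted scores.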

Keywords: recurrent neural network, players lineup, basketball data, decision making model

Procedia PDF Downloads 125
8611 Optimization of a Method of Total RNA Extraction from Mentha piperita

Authors: Soheila Afkar

Abstract:

Mentha piperita is a medicinal plant that contains large amounts of secondary metabolites, which adversely affect RNA extraction. Since high-quality RNA is the prerequisite for real-time PCR, this study optimized total RNA isolation from leaf tissues of Mentha piperita. To this end, two total RNA extraction methods, RNX-plus and modified RNX-plus (variants 1-5), were compared on leaves of Mentha piperita to find the one yielding the highest quality. RNA quality and integrity were assessed by visualization of the ribosomal RNA bands on 1.5% agarose gels. With the modified RNX-plus method (variant 2), the integrity of the 28S and 18S rRNA was highly satisfactory when analyzed on a denaturing agarose gel, so this method is suitable for RNA isolation from Mentha piperita.

Keywords: Mentha piperita, polyphenol, polysaccharide, RNA extraction

Procedia PDF Downloads 183
8610 A Study on Approximate Controllability of Impulsive Integrodifferential Systems with Non Local Conditions

Authors: Anandhi Santhosh

Abstract:

In order to describe various real-world problems in the physical and engineering sciences that are subject to abrupt changes at certain instants during the evolution process, impulsive differential equations have been used as the system model. In this article, the problem of approximate controllability for nonlinear impulsive integrodifferential equations with state-dependent delay is investigated. We study the approximate controllability of the nonlinear impulsive integrodifferential system under the assumption that the corresponding linear control system is approximately controllable. Using methods of functional analysis and semigroup theory, sufficient conditions are formulated and proved. Finally, an example is provided to illustrate the proposed theory.

Keywords: approximate controllability, impulsive differential system, fixed point theorem, state-dependent delay

Procedia PDF Downloads 379
8609 Railway Ballast Volumes Automated Estimation Based on LiDAR Data

Authors: Bahar Salavati Vie Le Sage, Ismaïl Ben Hariz, Flavien Viguier, Sirine Noura Kahil, Audrey Jacquin, Maxime Convert

Abstract:

The ballast layer plays a key role in railroad maintenance and in the geometry of the track structure; it also holds the track in place as trains roll over it. Track ballast is packed between the sleepers and along the sides of railway tracks. An imbalance in ballast volume on the tracks can lead to safety issues as well as rapid degradation of the overall quality of the railway segment. If there is a lack of ballast in the track bed during the summer, there is a risk that the rails will expand and buckle slightly due to the high temperatures. Furthermore, knowledge of the ballast quantities to be excavated during renewal works is important for efficient ballast management. The volume of excavated ballast per meter of track can be calculated from the excavation depth, the excavation width, the volume of the track skeleton (sleepers and rail), and the sleeper spacing. Since 2012, SNCF has been collecting 3D point cloud data covering its entire railway network using 3D laser scanning technology (LiDAR). This vast amount of data constitutes a model of the entire railway infrastructure, allowing various simulations to be conducted for maintenance purposes. This paper presents an automated method for ballast volume estimation based on the processing of LiDAR data. Abnormal ballast volumes on the tracks are estimated by analyzing the cross-section of the track. Further, since the amount of ballast required varies with the track configuration, knowledge of the ballast profile is required; prior to track rehabilitation, excess ballast is often present in the ballast shoulders. Based on the 3D laser scans, a Digital Terrain Model (DTM) was generated, and the ballast profiles are automatically extracted from this data.
The surplus of ballast is then estimated by comparing this empirically obtained ballast profile with a geometric model of the theoretical ballast profile thresholds dictated by maintenance standards. Ideally, this excess should be removed prior to renewal works and recycled to optimize the output of the ballast renewal machine. Based on these parameters, an application has been developed to allow the automatic measurement of ballast profiles. We evaluated the method on a 108-kilometer segment of railroad LiDAR scans, and the results show that the proposed algorithm detects a ballast surplus close to the total quantity of spoil ballast excavated.
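The surplus estimation step, comparing a measured cross-section against a theoretical profile, can be sketched as a trapezoidal integration of the positive gap, extruded along the track segment. This is an illustrative reconstruction under assumed profile data, not SNCF's implementation.

```python
def surplus_volume(xs, measured, theoretical, segment_length):
    """Surplus ballast estimate: integrate the positive gap between the
    measured cross-section profile and the theoretical maintenance profile
    (trapezoidal rule over lateral positions xs, in meters), then extrude
    the resulting area along the track segment."""
    area = 0.0
    for i in range(len(xs) - 1):
        gap0 = max(measured[i] - theoretical[i], 0.0)
        gap1 = max(measured[i + 1] - theoretical[i + 1], 0.0)
        area += 0.5 * (gap0 + gap1) * (xs[i + 1] - xs[i])
    return area * segment_length

# Illustrative profiles: a constant 0.5 m of excess over a 2 m wide shoulder,
# extruded over a 10 m track segment.
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
vol = surplus_volume(xs, [1.0] * 5, [0.5] * 5, segment_length=10.0)
```

On real DTM data the profiles would come from the extracted LiDAR cross-sections, sampled at each lateral position.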

Keywords: ballast, railroad, LiDAR, point cloud, track ballast, 3D point

Procedia PDF Downloads 102
8608 Theoretical Studies on the Structural Properties of 2,3-Bis(Furan-2-yl)Pyrazino[2,3-f][1,10]Phenanthroline Derivatives

Authors: Zahra Sadeghian

Abstract:

This paper reports the optimized geometrical parameters of the stationary point of 2,3-Bis(furan-2-yl)pyrazino[2,3-f][1,10]phenanthroline. The calculations are performed using the density functional theory (DFT) method at the B3LYP/LanL2DZ level. We determined bond lengths and bond angles for the compound and also calculated the bond hybridizations according to natural bond orbital (NBO) theory. The energies of the frontier orbitals (HOMO and LUMO) are computed. In addition, the calculated data are compared with the experimental results; this comparison shows that our theoretical data are in reasonable agreement with the experimental values.

Keywords: 2,3-Bis(furan-2-yl)pyrazino[2,3-f][1,10]phenanthroline, density functional theory, theoretical calculations, LanL2DZ level, B3LYP level

Procedia PDF Downloads 363
8607 Effect of Superabsorbent for the Improvement of Car Seat's Thermal Comfort

Authors: Funda Buyuk Mazari, Adnan Mazari, Antonin Havelka, Jakub Wiener, Jawad Naeem

Abstract:

The use of superabsorbent polymers (SAPs) for moisture absorption and comfort is still largely unexplored. In this research, the efficiency of different SAP fibrous webs is determined at different moisture percentages to examine their sorption and desorption efficiency. SAP fibrous webs of low thickness and high moisture absorption are tested within a multilayer sandwich structure of car seat cover material to determine the moisture absorption through the cover. A sweating guarded hot plate (SGHP) from the company Atlas is used to determine the moisture permeability of different car seat covers with a superabsorbent layer closed with impermeable polyurethane foam. The SAP fibrous layers prove very effective at absorbing and desorbing water vapor under extremely high and low moisture percentages, respectively. In extremely humid conditions (95% RH), a 20 g SAP layer absorbs nearly 3 g of water vapor per hour and reaches its maximum absorption capacity in 6 hours.

Keywords: car seat, comfort, SAF, superabsorbent

Procedia PDF Downloads 463
8606 Mechanical Properties of Sugar Palm Fibre Reinforced Thermoplastic Polyurethane Composites

Authors: Dandi Bachtiar, Mohammed Ausama Abbas, Januar Parlaungan Siregar, Mohd Ruzaimi Bin Mat Rejab

Abstract:

Short sugar palm fibre (SPF) and thermoplastic polyurethane (TPU) were combined to produce new composites by extrusion. Two techniques were used to prepare the new composite material: first, extrusion of the base material with the short fibre, and second, hot pressing. The sugar palm fibre size was fixed at 250 µm, and different fibre weight percentages (10 wt%, 20 wt%, and 30 wt%) were used in order to optimize the preparation process. The optimization was based on the characterization of mechanical properties such as the impact, tensile, and flexural properties of the new TPU/SPF composite material. The results showed that the best tensile and impact properties were obtained at 10 wt% fibre loading, with a maximum tensile strength of 14.0 MPa. Flexural properties showed an increasing trend with fibre loading, while no significant effect was recorded for additions beyond 30 wt% of fibre.

Keywords: composites, natural fibre, polyurethane, sugar palm

Procedia PDF Downloads 375
8605 Ultracapacitor State-of-Energy Monitoring System with On-Line Parameter Identification

Authors: N. Reichbach, A. Kuperman

Abstract:

The paper describes the design of a monitoring system for supercapacitor packs in propulsion systems, allowing the instantaneous energy capacity to be determined under power loading. The system contains a real-time recursive least-squares (RLS) identification mechanism that estimates the pack capacitance and equivalent series resistance. These values are required for accurate calculation of the state-of-energy.
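As a rough illustration of such an identification mechanism, the following sketch estimates an equivalent series resistance and capacitance with recursive least-squares; the pack values, forgetting factor and regression model are assumptions for illustration, not from the paper:

```python
import numpy as np

def rls_update(theta, P, x, y, lam=0.99):
    """One recursive least-squares step with forgetting factor lam."""
    x = x.reshape(-1, 1)
    k = P @ x / (lam + x.T @ P @ x)          # gain vector
    err = y - (x.T @ theta).item()           # prediction error
    theta = theta + k * err
    P = (P - k @ x.T @ P) / lam
    return theta, P

# Hypothetical pack: ESR = 20 mOhm, C = 100 F.
# Voltage sag below open-circuit voltage: y = R*i + q/C,
# so the regressor is [i, q] and theta = [R, 1/C].
R_true, C_true = 0.020, 100.0
dt = 0.1
rng = np.random.default_rng(0)
i = 5.0 + 2.0 * rng.standard_normal(2000)    # load current, A
q = np.cumsum(i) * dt                        # removed charge, C
y = R_true * i + q / C_true                  # measured voltage sag, V

theta = np.zeros((2, 1))
P = 1e3 * np.eye(2)
for n in range(len(i)):
    theta, P = rls_update(theta, P, np.array([i[n], q[n]]), y[n])

R_est = theta[0].item()
C_est = 1.0 / theta[1].item()
```

With noiseless synthetic data, the estimates converge to the true parameters; in practice, the forgetting factor lets the estimates track slow capacitance fade and resistance growth as the pack ages.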

Keywords: real-time monitoring, RLS identification algorithm, state-of-energy, super capacitor

Procedia PDF Downloads 525
8604 Comparative Analysis of Two Approaches to Joint Signal Detection, ToA and AoA Estimation in Multi-Element Antenna Arrays

Authors: Olesya Bolkhovskaya, Alexey Davydov, Alexander Maltsev

Abstract:

In this paper, two approaches to joint signal detection, time of arrival (ToA) and angle of arrival (AoA) estimation in a multi-element antenna array are investigated. Two scenarios were considered: the first, when the waveform of the useful signal is known a priori, and the second, when the waveform of the desired signal is unknown. For the first scenario, antenna array signal processing based on multi-element matched filtering (MF), followed by a non-coherent detection scheme and maximum likelihood (ML) parameter estimation blocks, is exploited. For the second scenario, signal processing based on estimation of the covariance matrix of the antenna array elements, followed by eigenvector analysis and ML parameter estimation blocks, is applied. The performance characteristics of both signal processing schemes are thoroughly investigated and compared for different useful signal and noise parameters.
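For the known-waveform scenario, the single-element core of the matched-filter/ML ToA idea can be sketched as follows; the template waveform, sample rate, delay and noise level are illustrative assumptions, not the paper's signal model:

```python
import numpy as np

fs = 1.0e6                                   # sample rate, Hz (assumed)
rng = np.random.default_rng(1)
s = rng.standard_normal(200)                 # known wideband template (assumed)
true_delay = 700                             # ToA in samples

rx = np.zeros(4096)
rx[true_delay:true_delay + len(s)] += s      # delayed useful signal
rx += 0.1 * rng.standard_normal(len(rx))     # additive white Gaussian noise

# Matched filter: correlate the received signal with the known template.
# Under white Gaussian noise, the peak location is the ML ToA estimate.
mf = np.correlate(rx, s, mode="valid")
toa_samples = int(np.argmax(np.abs(mf)))
toa_seconds = toa_samples / fs
```

In the multi-element case the same correlation is applied per element, the outputs are combined non-coherently for detection, and the inter-element phase relationship additionally yields the AoA.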

Keywords: antenna array, signal detection, ToA, AoA estimation

Procedia PDF Downloads 487
8603 Milling Simulations with a 3-DOF Flexible Planar Robot

Authors: Hoai Nam Huynh, Edouard Rivière-Lorphèvre, Olivier Verlinden

Abstract:

Manufacturing technologies are becoming continuously more diversified over the years. The increasing use of robots for various applications such as assembling, painting, and welding has also affected the field of machining. Machining robots can deal with larger workspaces than conventional machine tools at a lower cost and thus represent a very promising alternative for machining applications. Furthermore, their inherent structure ensures great flexibility of motion to reach any location on the workpiece with the desired orientation. Nevertheless, machining robots suffer from a lack of stiffness at their joints, restricting their use to applications involving low cutting forces, especially finishing operations. Vibratory instabilities may also occur while machining and deteriorate the precision, leading to scrap parts. Some researchers are therefore concerned with the identification of optimal parameters in robotic machining. This paper continues the development of a virtual robotic machining simulator in order to find optimized cutting parameters in terms of, for example, depth of cut or feed per tooth. The simulation environment combines an in-house milling routine (DyStaMill), which computes the cutting forces and material removal, with an in-house multibody library (EasyDyn), which is used to build a dynamic model of a 3-DOF planar robot with flexible links. The position of the robot end-effector subjected to milling forces is controlled through an inverse kinematics scheme, with the position of each joint controlled separately. Each joint is actuated by a servomotor whose transfer function has been computed in order to tune the corresponding controller. The output results feature the evolution of the cutting forces, with and without a deformable robot structure, and the tracking errors of the end-effector. Illustrations of the resulting machined surfaces are also presented.
The consideration of the links flexibility has highlighted an increase of the cutting forces magnitude. This proof of concept will aim to enrich the database of results in robotic machining for potential improvements in production.
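An inverse kinematics scheme for a planar 3-DOF arm of the kind modelled above has a standard closed form; the sketch below is a generic version (the link lengths and target pose are illustrative assumptions, not the simulator's values):

```python
import math

def ik_planar_3dof(x, y, phi, L1, L2, L3):
    """Closed-form inverse kinematics of a planar 3R arm: joint angles
    reaching the tool pose (x, y) with end-effector orientation phi."""
    # Wrist position: step back from the tool tip along its orientation
    wx = x - L3 * math.cos(phi)
    wy = y - L3 * math.sin(phi)
    d2 = wx * wx + wy * wy
    c2 = (d2 - L1 * L1 - L2 * L2) / (2.0 * L1 * L2)
    if abs(c2) > 1.0:
        raise ValueError("target pose outside workspace")
    q2 = math.acos(c2)                        # one of the two elbow solutions
    q1 = math.atan2(wy, wx) - math.atan2(L2 * math.sin(q2),
                                         L1 + L2 * math.cos(q2))
    q3 = phi - q1 - q2                        # orient the tool link
    return q1, q2, q3

# Hypothetical link lengths (m) and a target pose inside the workspace
q1, q2, q3 = ik_planar_3dof(0.8, 0.4, 0.0, 0.5, 0.4, 0.2)
```

In the simulator described above, each of the resulting joint angles would then be tracked by its own servomotor controller, while link flexibility makes the actual end-effector pose deviate from this rigid-body solution.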

Keywords: control, milling, multibody, robotic, simulation

Procedia PDF Downloads 240
8602 Assessment of Procurement-Demand of Milk Plant Using Quality Control Tools: A Case Study

Authors: Jagdeep Singh, Prem Singh

Abstract:

Milk is considered an essential and complete food. The present study was conducted at the Milk Plant Mohali, with particular reference to the procurement section, where the cash inflow was maximum, with the objective of achieving higher productivity and reducing the wastage of milk. It was observed that from January 2014 to March 2014, the average procurement of milk was 4,19,361 litres per month at a procurement cost of Rs. 35 per litre, giving a total procurement cost of about Rs. 1 crore 46 lakh per month. However, there was a mismatch between the procurement and production of milk, which led to an average loss of Rs. 12,94,405 per month. To solve the procurement-production problem, quality control tools such as brainstorming, flow charts, cause-and-effect diagrams and Pareto analysis were applied wherever applicable. With the successful implementation of these quality control tools, an average saving of Rs. 4,59,445 per month was achieved.
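The core of a Pareto analysis such as the one applied above is ranking loss causes and isolating the "vital few" that account for roughly 80% of the total; a minimal sketch (the cause names and rupee figures below are hypothetical, not the plant's data):

```python
# Hypothetical monthly loss figures per cause (Rs.), for illustration only
losses = {
    "procurement-production mismatch": 500000,
    "transit spillage": 180000,
    "souring/quality rejection": 120000,
    "metering errors": 60000,
    "other": 40000,
}

total = sum(losses.values())
cumulative = 0.0
vital_few = []
# Sort causes by loss, accumulate until the 80% cut-off is reached
for cause, amount in sorted(losses.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += amount
    vital_few.append(cause)
    if cumulative / total >= 0.8:
        break
```

The causes left in `vital_few` are the ones on which improvement effort is concentrated first.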

Keywords: milk, procurement-demand, quality control tools

Procedia PDF Downloads 522
8601 The Analysis of Different Classes of Weighted Fuzzy Petri Nets and Their Features

Authors: Yurii Bloshko, Oksana Olar

Abstract:

This paper presents an analysis of six different classes of Petri nets: fuzzy Petri nets (FPN), generalized fuzzy Petri nets (GFPN), parameterized fuzzy Petri nets (PFPN), T2GFPN, flexible generalized fuzzy Petri nets (FGFPN), and binary Petri nets (BPN). These classes were simulated in the dedicated software PNeS® to analyse their pros and cons, using example models dedicated to the decision-making process of passenger transport logistics. The paper analyses two approaches: one in which the input values are filled in with the experts' knowledge, and one in which fuzzy expectations, represented by output values, are added as well. These approaches exercise the triples of functions, which are replaced with different combinations of t-norms and s-norms.
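A minimal sketch of the kind of t-norm/s-norm substitution referred to above, for a single fuzzy transition; the norm pairs are standard choices, while the firing rule, markings and certainty factor are illustrative assumptions rather than the exact semantics of any of the six classes:

```python
# Common t-norm / s-norm pairs that can be swapped into the triples
t_norms = {
    "min":         lambda a, b: min(a, b),
    "product":     lambda a, b: a * b,
    "lukasiewicz": lambda a, b: max(0.0, a + b - 1.0),
}
s_norms = {
    "max":         lambda a, b: max(a, b),
    "prob_sum":    lambda a, b: a + b - a * b,
    "lukasiewicz": lambda a, b: min(1.0, a + b),
}

def fire_degree(input_markings, certainty, t_norm):
    """Combine input-place truth degrees with a t-norm, then apply the
    rule certainty factor (a simplified FPN-style firing rule)."""
    degree = input_markings[0]
    for m in input_markings[1:]:
        degree = t_norm(degree, m)
    return t_norm(degree, certainty)

def update_output(old_marking, degree, s_norm):
    """Accumulate the firing degree into an output place with an s-norm."""
    return s_norm(old_marking, degree)

# Two expert-supplied truth degrees and a rule certainty of 0.9
d = fire_degree([0.7, 0.8], 0.9, t_norms["min"])    # -> 0.7
out = update_output(0.4, d, s_norms["max"])         # -> 0.7
```

Swapping `min`/`max` for `product`/`prob_sum` in the same calls is exactly the kind of combination the compared net classes vary.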

Keywords: fuzzy petri net, intelligent computational techniques, knowledge representation, triangular norms

Procedia PDF Downloads 136
8600 Horse Chestnut Starch: A Novel Inedible Feedstock Source for Producing Thermoplastic Starch (TPS)

Authors: J. Castaño, S. Rodriguez, C. M. L. Franco

Abstract:

Starch isolated from non-edible A. hippocastanum seeds was characterized and used for preparing starch-based materials. The apparent amylose content of the isolated starch was 33.1%. The size of the starch granules ranged from 0.7 to 35 µm and correlated with the shape of the granules (spherical, oval and irregular). The chain length distribution profile of amylopectin showed two peaks, at polymerization degrees (DP) of 12 and 41-43. Around 53% of the branch unit chains had a DP in the range of 11-20. A. hippocastanum starch displayed a typical C-type pattern, and the maximum decomposition temperature was 317°C. Thermoplastic starch (TPS) prepared from A. hippocastanum with glycerol and processed by melt blending exhibited adequate mechanical and thermal properties. In contrast, TPS plasticized with glycerol:malic acid (1:1) showed lower thermal stability and a pasty and sticky behavior, indicating that malic acid accelerates the degradation of starch during processing.

Keywords: Aesculus hippocastanum L., amylopectin structure, thermoplastic starch, non-edible source

Procedia PDF Downloads 365
8599 Different Methods of Producing Bioemulsifier by Bacillus licheniformis Strains

Authors: Saba Pajuhan, Afshin Farahbakhsh, S. M. M. Dastgheib

Abstract:

Biosurfactants and bioemulsifiers are a structurally diverse group of surface-active molecules synthesized by microorganisms. They are amphipathic molecules that reduce surface and interfacial tensions and are widely used in the pharmaceutical, cosmetic, food and petroleum industries. In this paper, several methods of bioemulsifier synthesis and purification by Bacillus licheniformis strains (namely ACO1, PTCC 1595 and ACO4) were investigated. The strains were grown in nutrient broth under different conditions in order to obtain maximum production of bioemulsifier. The bioemulsifier was purified, and the quality of the product evaluated, by adding sulfuric acid (H₂SO₄, 98%), ethanol or HCl to the solution, followed by centrifugation. To determine the optimal conditions yielding the highest bioemulsifier production, the effects of various carbon and nitrogen sources, temperature, NaCl concentration, pH, O₂ level and incubation time were examined; all of these factors were found to strongly influence bioemulsifier production.

Keywords: biosurfactant, bioemulsifier, purification, surface tension, interfacial tension

Procedia PDF Downloads 260
8598 Trip Reduction in Turbo Machinery

Authors: Pranay Mathur, Carlo Michelassi, Simi Karatha, Gilda Pedoto

Abstract:

Industrial plant uptime is of topmost importance for reliable, profitable and sustainable operation. Trips and failed starts have a major impact on plant reliability, and all plant operators focus their efforts on minimising them. The performance of these CTQs is measured with two metrics: MTBT (mean time between trips) and SR (starting reliability). These metrics help to identify the top failure modes and the units that need more effort to improve plant reliability. The Baker Hughes trip reduction program is structured to reduce these unwanted trips through: (1) real-time machine operational parameters available remotely, capturing the signature of a malfunction together with the related boundary conditions; (2) a real-time alerting system based on analytics, available remotely; (3) remote access to trip logs and alarms from the control system to identify the cause of events; (4) continuous support to field engineers by remotely connecting them with subject matter experts; (5) live tracking of key CTQs; (6) benchmarking against the fleet; (7) breaking down the cause of failure to component level; (8) investigating the top contributors and identifying design and operational root causes; (9) implementing corrective and preventive actions; (10) assessing the effectiveness of implemented solutions using reliability growth models; and (11) developing analytics for predictive maintenance. With this approach, the Baker Hughes team is able to support customers in achieving their reliability key performance indicators for monitored units, with huge cost savings for plant operators. This presentation explains the approach and provides successful case studies, in particular one in which 12 LNG and pipeline operators with about 140 gas compression line-ups adopted these techniques, significantly reducing the number of trips and improving MTBT.
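The two CTQ metrics named above reduce to simple ratios; a sketch with hypothetical fleet figures (the hours and counts are illustrative, not from the case studies):

```python
def mtbt(operating_hours, trips):
    """Mean time between trips over an observation window (hours/trip)."""
    return operating_hours / trips if trips else float("inf")

def starting_reliability(successful_starts, attempted_starts):
    """SR: fraction of start attempts that succeed."""
    return successful_starts / attempted_starts

# e.g. a unit that ran 8000 h with 4 trips and started 19 times out of 20
unit_mtbt = mtbt(8000.0, 4)            # 2000.0 h between trips
unit_sr = starting_reliability(19, 20)  # 0.95
```

Tracking these per unit and benchmarking against the fleet is what flags the units needing root-cause investigation.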

Keywords: reliability, availability, sustainability, digital infrastructure, weibull, effectiveness, automation, trips, fail start

Procedia PDF Downloads 68
8597 Erosion Modeling of Surface Water Systems for Long Term Simulations

Authors: Devika Nair, Sean Bellairs, Ken Evans

Abstract:

Flow and erosion modeling provides an avenue for simulating the fine suspended sediment in surface water systems like streams and creeks. Fine suspended sediment is highly mobile, and many contaminants that may have been released by any sort of catchment disturbance attach themselves to these sediments. Therefore, a knowledge of fine suspended sediment transport is important in assessing contaminant transport. The CAESAR-Lisflood Landform Evolution Model, which includes a hydrologic model (TOPMODEL) and a hydraulic model (Lisflood), is being used to assess the sediment movement in tropical streams on account of a disturbance in the catchment of the creek and to determine the dynamics of sediment quantity in the creek through the years by simulating the model for future years. The accuracy of future simulations depends on the calibration and validation of the model to the past and present events. Calibration and validation of the model involve finding a combination of parameters of the model, which, when applied and simulated, gives model outputs similar to those observed for the real site scenario for corresponding input data. Calibrating the sediment output of the CAESAR-Lisflood model at the catchment level and using it for studying the equilibrium conditions of the landform is an area yet to be explored. Therefore, the aim of the study was to calibrate the CAESAR-Lisflood model and then validate it so that it could be run for future simulations to study how the landform evolves over time. To achieve this, the model was run for a rainfall event with a set of parameters, plus discharge and sediment data for the input point of the catchment, to analyze how similar the model output would behave when compared with the discharge and sediment data for the output point of the catchment. The model parameters were then adjusted until the model closely approximated the real site values of the catchment. 
It was then validated by running the model for a different set of events and checking that the model gave similar results to the real site values. The outcomes demonstrated that while the model can be calibrated to a greater extent for hydrology (discharge output) throughout the year, the sediment output calibration may be slightly improved by having the ability to change parameters to take into account the seasonal vegetation growth during the start and end of the wet season. This study is important to assess hydrology and sediment movement in seasonal biomes. The understanding of sediment-associated metal dispersion processes in rivers can be used in a practical way to help river basin managers more effectively control and remediate catchments affected by present and historical metal mining.
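The calibration described above needs a quantitative measure of how closely the modelled discharge approximates the observed series; a common choice in hydrology is the Nash-Sutcliffe efficiency, sketched here with illustrative (not site) values:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 means a perfect match; values at or
    below 0 mean the model is no better than predicting the observed mean."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)

obs = [2.0, 3.5, 6.0, 4.2, 2.8]   # observed discharge, m3/s (hypothetical)
sim = [2.2, 3.3, 5.6, 4.5, 2.9]   # modelled discharge, m3/s (hypothetical)
score = nash_sutcliffe(obs, sim)
```

Parameter sets are adjusted until such a score (for discharge, and separately for sediment output) is acceptably high across calibration events, then held fixed for the validation events.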

Keywords: erosion modelling, fine suspended sediments, hydrology, surface water systems

Procedia PDF Downloads 81
8596 Covariate-Adjusted Response-Adaptive Designs for Semi-Parametric Survival Responses

Authors: Ayon Mukherjee

Abstract:

Covariate-adjusted response-adaptive (CARA) designs use the available responses to skew the treatment allocation in a clinical trial towards the treatment found, at an interim stage, to be best for a given patient's covariate profile. Extensive research has been done on various aspects of CARA designs with the patient responses assumed to follow a parametric model. However, the range of application of such designs is limited in real-life clinical trials, where the responses infrequently fit a particular parametric form. On the other hand, it is the parametric assumption that yields robust estimates of the covariate-adjusted treatment effects. To balance these two requirements, designs are developed that are free from distributional assumptions about the survival responses, relying only on the assumption of proportional hazards for the two treatment arms. The proposed designs are developed by deriving two types of optimum allocation designs, and also by using a distribution function to link the past allocation, covariate and response histories to the present allocation. The optimal designs are based on biased coin procedures, with a bias towards the better treatment arm: the doubly-adaptive biased coin design (DBCD) and the efficient randomized adaptive design (ERADE). The treatment allocation proportions for these designs converge to the expected target values, which are functions of the Cox regression coefficients estimated sequentially. These expected target values are derived from constrained optimization problems and are updated as information accrues with the sequential arrival of patients. The design based on the link function is derived using the distribution function of a probit model whose parameters are adjusted based on the covariate profile of the incoming patient.
To apply such designs, the treatment allocation probabilities are sequentially modified based on the treatment allocation history, the response history, the previous patients' covariates and the covariates of the incoming patient. Given this information, an expression is obtained for the conditional probability of allocating a patient to a treatment arm. Based on simulation studies, it is found that the ERADE is preferable to the DBCD when the main aim is to minimize the variance of the observed allocation proportion and to maximize the power of the Wald test for a treatment difference. However, the former procedure, being discrete, tends to converge more slowly towards the expected target allocation proportion. The link-function-based design achieves the highest skewness of patient allocation towards the best treatment arm and is thus, ethically, the best design. Other comparative merits of the proposed designs are highlighted, and their preferred areas of application are discussed. It is concluded that the proposed CARA designs can be considered suitable alternatives to traditional balanced randomization designs in survival trials in terms of the power of the Wald test, provided that response data are available during the recruitment phase of the trial to enable adaptations to the designs. Moreover, the proposed designs enable more patients to be treated with the better treatment during the trial, making the designs more ethically attractive to patients. An existing clinical trial has been redesigned using these methods.
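The probit-link allocation idea above can be sketched for an incoming patient; the effect estimates, covariate vector and tuning constant below are hypothetical placeholders, not the paper's design:

```python
import math

def normal_cdf(z):
    """Standard normal distribution function via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def allocation_probability(beta_hat, covariates, gamma=1.0):
    """Probability of assigning the incoming patient to arm A through a
    probit link on the estimated covariate-adjusted treatment effect.
    beta_hat: sequentially updated effect estimates (hypothetical values);
    gamma: tuning constant controlling how strongly allocation is skewed."""
    effect = sum(b * x for b, x in zip(beta_hat, covariates))
    return normal_cdf(gamma * effect)   # > 0.5 skews allocation towards arm A

# Hypothetical interim estimates and an incoming patient's covariates
p = allocation_probability([0.8, -0.3], [1.0, 0.5])
```

A larger estimated advantage of arm A for this covariate profile pushes the probability above one half, which is exactly the ethical skewing the design aims for.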

Keywords: censored response, Cox regression, efficiency, ethics, optimal allocation, power, variability

Procedia PDF Downloads 159
8595 Effect of Design Parameters on a Two Stage Launch Vehicle Performance

Authors: Assem Sallam, Aly Elzahaby, Ahmed Makled, Mohamed Khalil

Abstract:

Changes in the design parameters of a launch vehicle affect its overall flight path trajectory. In this paper, several design parameters are introduced to study their effect. The selected parameters are the launch vehicle mass, represented by the payload mass; the maximum allowable angle of attack the launch vehicle can withstand; the flight path angle predefined for the launch vehicle's second stage; the required inclination and its effect on the launch azimuth; and, finally, the launch pad coordinates. The selected design parameters are studied for their effect on the variation of altitude, ground range, absolute velocity and flight path angle. The study provides a general means of adjusting the design parameters to reach the required launch vehicle performance.
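The trajectory quantities listed above (altitude, ground range, velocity, flight path angle) come from integrating the planar point-mass ascent equations; a drag-free, flat-Earth sketch with illustrative thrust, mass and burn-time values (not the paper's vehicle):

```python
import math

def simulate_ascent(thrust=2.0e5, m0=1.0e4, mdot=40.0, dt=0.1, t_burn=60.0):
    """Planar point-mass gravity-turn sketch: forward-Euler integration of
    dV/dt = T/m - g*sin(gamma), dgamma/dt = -(g/V)*cos(gamma),
    dh/dt = V*sin(gamma), dx/dt = V*cos(gamma). Drag neglected."""
    g = 9.81
    V = 50.0                           # speed after initial vertical rise, m/s
    gamma = math.radians(85.0)         # flight path angle after pitch-over
    h, x, m, t = 100.0, 0.0, m0, 0.0
    while t < t_burn:
        a = thrust / m - g * math.sin(gamma)       # along-velocity acceleration
        gamma += -(g / V) * math.cos(gamma) * dt   # gravity-turn rotation
        V += a * dt
        h += V * math.sin(gamma) * dt              # altitude
        x += V * math.cos(gamma) * dt              # ground range
        m -= mdot * dt                             # propellant consumption
        t += dt
    return V, math.degrees(gamma), h, x

V, gamma_deg, h, x = simulate_ascent()
```

Sweeping a design parameter (e.g. `m0` for payload mass, or the initial `gamma` for the second-stage flight path angle) and re-running the integration is the basic mechanism behind the parameter study described above.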

Keywords: launch vehicle azimuth, launch vehicle trajectory, launch vehicle payload, launch pad location

Procedia PDF Downloads 306