Search results for: first order plus dead time process
13199 Application of a Time-Frequency-Based Blind Source Separation to an Instantaneous Mixture of Secondary Radar Sources
Authors: M. Tria, M. Benidir, E. Chaumette
Abstract:
In Secondary Surveillance Radar (SSR) systems, locating and recognising aircraft in the neighbourhood of civil airports becomes more difficult as aerial traffic grows. Here, we propose to apply a recent Blind Source Separation (BSS) algorithm based on time-frequency analysis in order to separate messages sent by different aircraft that fall in the same radar beam at reception. The source separation method involves joint diagonalization of a set of smoothed versions of spatial Wigner-Ville distributions. The technique exploits the differences in the time-frequency signatures of the nonstationary sources to be separated. Consequently, as the SSR sources emit different messages at different frequencies, the method is well suited to this new application. We applied the technique in simulation to separate SSR replies, and results are provided at the end of the paper.
Keywords: Blind Source Separation, Time-Frequency Analysis, Secondary Radar
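As a rough, hedged illustration of how differing time-frequency signatures can drive the separation of an instantaneous mixture, the sketch below replaces the joint diagonalization of smoothed spatial Wigner-Ville distributions with the simplest possible variant: two spatial covariance matrices, one taken over the whole record and one over a region where a single source dominates, combined through a generalized eigendecomposition. The signals, mixing matrix and region choice are invented for the example and are not the authors' SSR data or algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, n = 1000, 4000
t = np.arange(n) / fs

# Two nonstationary "replies": bursts at different frequencies and times.
s1 = np.sin(2 * np.pi * 60 * t) * (t < 2.0)      # active in the first half
s2 = np.sin(2 * np.pi * 170 * t) * (t >= 1.0)    # active in the second half
S = np.vstack([s1, s2])

# Instantaneous mixture on two sensors (hypothetical mixing matrix).
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
X = A @ S + 0.01 * rng.standard_normal((2, n))

# Spatial covariance over the whole record and over a region where
# source 1 dominates (a crude stand-in for a set of t-f points).
R0 = (X @ X.T) / n
X1 = X[:, t < 1.0]                               # only s1 is active here
R1 = (X1 @ X1.T) / X1.shape[1]

# Generalized eigendecomposition R1 v = lambda R0 v; the transposed
# eigenvector matrix acts as the unmixing matrix (up to scale/permutation).
_, V = np.linalg.eig(np.linalg.inv(R0) @ R1)
V = np.real(V)                                   # eigenvalues are real here
Y = V.T @ X

# Correlate recovered signals with the originals to check the separation.
for i in range(2):
    c = [abs(np.corrcoef(Y[i], S[j])[0, 1]) for j in range(2)]
    print(f"estimated source {i}: correlation with s1/s2 = {c[0]:.2f}/{c[1]:.2f}")
```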
13198 Modular Data and Calculation Framework for a Technology-Based Mapping of the Manufacturing Process According to the Value Stream Management Approach
Authors: Tim Wollert, Fabian Behrendt
Abstract:
Value Stream Management (VSM) is a widely used methodology in the context of Lean Management for improving end-to-end material and information flows from a supplier to a customer from a company’s perspective. Whereas its design principles, e.g. pull, value-adding and customer orientation, remain valid against the background of an increasingly digitalized and dynamic environment, the methodology itself for mapping a value stream is time- and resource-intensive due to the high degree of manual activities. The digitalization of processes in the context of Industry 4.0 opens up new opportunities to reduce these manual efforts and make the VSM approach more agile. The paper at hand aims at providing a modular data and calculation framework that utilizes the business data provided by information and communication technologies to automate the value stream mapping process, with a focus on the manufacturing process.
Keywords: Industry 4.0, lean management 4.0, value stream management 4.0, value stream mapping.
13197 A Comparative Study on Biochar from Slow Pyrolysis of Corn Cob and Cassava Wastes
Authors: Adilah Shariff, Nurhidayah Mohamed Noor, Alexander Lau, Muhammad Azwan Mohd Ali
Abstract:
Biomass such as corn and cassava wastes, if left to decay, will release significant quantities of greenhouse gases (GHG), including carbon dioxide and methane. These biomass wastes can be converted into biochar via a thermochemical process such as slow pyrolysis. This approach reduces the biomass wastes as well as preserving their carbon content. Biochar has the potential to be used for carbon sequestration and as a soil amendment. The aim of this study is to investigate the characteristics of corn cob, cassava stem and cassava rhizome in order to identify their potential as pyrolysis feedstocks for biochar production. This was achieved by using proximate and elemental analyses as well as calorific value and lignocellulosic determination. The second objective is to investigate the effect of pyrolysis temperature on the biochar produced. A fixed-bed slow pyrolysis reactor was used to pyrolyze the corn cob, cassava stem and cassava rhizome. The pyrolysis temperatures were varied between 400 °C and 600 °C, while the heating rate and the holding time were fixed at 5 °C/min and 1 hour, respectively. Corn cob, cassava stem and cassava rhizome were found to be suitable feedstocks for the pyrolysis process because they contained a high percentage of volatile matter, more than 80 mf wt.%. All three feedstocks contained low nitrogen and sulphur contents, less than 1 mf wt.%; therefore, during the pyrolysis process the feedstocks give off only very low rates of GHGs such as nitrogen oxides and sulphur oxides. Independent of the type of biomass, the percentage of biochar yield is inversely proportional to the pyrolysis temperature. The highest biochar yield at each studied temperature was from slow pyrolysis of cassava rhizome, as this feedstock contained the highest percentage of ash compared to the other two feedstocks. The percentage of fixed carbon in all the biochars increased as the pyrolysis temperature increased. The increase of pyrolysis temperature from 400 °C to 600 °C increased the fixed carbon of corn cob biochar, cassava stem biochar and cassava rhizome biochar by 26.35%, 10.98% and 6.20%, respectively. Irrespective of the pyrolysis temperature, all the biochars produced were found to contain more than 60 mf wt.% fixed carbon, much higher than that of their feedstocks.
Keywords: Biochar, biomass, cassava wastes, corn cob, pyrolysis.
13196 Influence of Densification Process and Material Properties on Final Briquettes Quality from Fast-Growing Willows
Authors: Peter Križan, Juraj Beniak, Ľubomír Šooš, Miloš Matúš
Abstract:
Biomass treatment through densification is a very suitable and helpful technology before its effective energy recovery. The densification process of biomass is significantly influenced by various technological and material variables, which are ultimately reflected in the final solid biofuel quality. The paper deals with experimental research on the relationship between technological and material variables during densification of fast-growing trees, namely fast-growing willows. The main goal of the presented experimental research is to determine the relationship between compression pressure and raw material particle size from the point of view of final briquette density. The experiments were realized by single-axis densification. The impact of particle size, in interaction with compression pressure and stabilization time, on the quality properties of briquettes was determined. The interaction of these variables affects the final solid biofuel (briquette) quality. From the point of view of briquette production and of densification machine design, it is very important to understand the mutual interaction of these variables and its effect on final briquette quality. The experimental findings presented here show the importance of the mentioned variables during the densification process.
Keywords: Briquettes density, densification, particle size, compression pressure, stabilization time.
13195 Improved Ant Colony Optimization for Solving Reliability Redundancy Allocation Problems
Authors: Phakhapong Thanitakul, Worawat Sa-ngiamvibool, Apinan Aurasopon, Saravuth Pothiya
Abstract:
This paper presents an improved ant colony optimization (IACO) for solving the reliability redundancy allocation problem (RAP) in order to maximize system reliability. To improve the performance of the ACO algorithm, two additional techniques, i.e. neighborhood search and a re-initialization process, are introduced. To show its efficiency and effectiveness, the proposed IACO is applied to solve three RAPs. Additionally, the results of the proposed IACO are compared with those of conventional heuristic approaches, i.e. the genetic algorithm (GA), particle swarm optimization (PSO) and ant colony optimization (ACO). The experimental results show that the proposed IACO approach is capable of obtaining higher-quality solutions in less computational time.
Keywords: Ant colony optimization, Heuristic algorithm, Mixed-integer nonlinear programming, Redundancy allocation problem, Reliability optimization.
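To make the redundancy allocation setting concrete, the following minimal ant colony optimization sketch (our own illustration, not the paper's IACO: it has neither the neighborhood search nor the re-initialization step) maximizes the reliability of a small hypothetical series system in which each subsystem may carry one to four redundant components under a cost budget.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 4-subsystem series system: component reliability and cost.
r = np.array([0.80, 0.85, 0.90, 0.75])
c = np.array([3.0, 4.0, 5.0, 2.0])
budget = 50.0
levels = np.arange(1, 5)                      # 1..4 redundant components

def system_reliability(n):
    # Series of parallel blocks: R = prod(1 - (1 - r_i)^n_i)
    return np.prod(1.0 - (1.0 - r) ** n)

def cost(n):
    return float(np.sum(c * n))

tau = np.ones((len(r), len(levels)))          # pheromone trails
best_n, best_R = None, -1.0

for _ in range(200):                          # iterations
    for _ant in range(20):                    # ants per iteration
        probs = tau / tau.sum(axis=1, keepdims=True)
        n = np.array([rng.choice(levels, p=probs[i]) for i in range(len(r))])
        if cost(n) > budget:                  # infeasible solution, skip it
            continue
        R = system_reliability(n)
        if R > best_R:
            best_R, best_n = R, n.copy()
    # Evaporate, then reinforce the best-so-far allocation.
    tau *= 0.9
    if best_n is not None:
        for i, lvl in enumerate(best_n):
            tau[i, lvl - 1] += best_R

print("best allocation:", best_n, "reliability: %.4f" % best_R, "cost:", cost(best_n))
```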
13194 The Integration Process of Non-EU Citizens in Luxembourg: From an Empirical Approach Toward a Theoretical Model
Authors: Angela Odero, Chrysoula Karathanasi, Michèle Baumann
Abstract:
Integration of foreign communities has been a forefront issue in Luxembourg for some time now. The country’s continued progress depends largely on the successful integration of immigrants. The aim of our study was to analyze the factors that intervene in the course of integration of non-EU citizens through the discourse of non-EU citizens residing in Luxembourg who have signed the Welcome and Integration Contract (CAI). The two-year contract offers integration services to assist foreigners in getting settled in the country. Semi-structured focus group discussions with 50 volunteers were held in English, French, Spanish, Serbo-Croatian or Chinese, and participants were asked to talk about their integration experiences. The discussions were recorded and transcribed, and the transcriptions were analyzed with the help of NVivo 10, a qualitative analysis software package. A systematic and reiterative analysis of decomposing and reconstituting was realized through (1) the identification of predetermined categories (difficulties, challenges and integration needs), (2) initial coding, the grouping together of similar ideas, and (3) axial coding, the regrouping of items from the initial coding in new ways in order to create sub-categories and identify other core dimensions. Our results show that the intervening factors include language acquisition, professional career and socio-cultural activities or events. Each of these factors comprises different components whose weight shifts from person to person and from situation to situation. Connecting these three emergent factors are two elements essential to the success of the immigrant’s integration: the role of time, and deliberate effort from the immigrants, the community, and the formal institutions charged with helping immigrants integrate. We propose a theoretical model in which the factors described may be classified in terms of how they predispose, facilitate and/or reinforce the process towards a successful integration. Measures currently in place propose one-size-fits-all programs, yet integrative measures that target the family unit, and measures customized to target groups based on their needs, would work best.
Keywords: Integration, Integration Services, Non-EU citizens, Qualitative Analysis, Third Country Nationals.
13193 Factors Affecting Employee Decision Making in an AI Environment
Authors: Yogesh C. Sharma, A. Seetharaman
Abstract:
The decision-making process in humans is a complicated system influenced by a variety of intrinsic and extrinsic factors. Human decisions have a ripple effect on subsequent decisions. In this study, the scope of human decision making is limited to employees. In an organisation, a person makes a variety of decisions from the time they are hired to the time they retire. The goal of this research is to identify the various elements that influence decision making. In addition, the environment in which a decision is made is a significant aspect of the decision-making process. Employees in today's workplace use artificial intelligence (AI) systems for automation and decision augmentation. The impact of AI systems on the decision-making process is examined in this study. The research is designed as a systematic literature review; based on gaps in the literature, limitations and the scope of future research have been identified. From these findings, a research framework has been designed to identify the various factors affecting employee decision making. Employee decision making is influenced by technological advancement, data-driven culture, human trust, decision automation-augmentation and workplace motivation. Hybrid human-AI systems require the development of new skill sets and organisational design. Employee psychological safety and supportive leadership influence overall job satisfaction.
Keywords: Employee decision making, artificial intelligence, environment, human trust, technology innovation, psychological safety.
13192 Specification of Attributes of a Multimedia Presentation for Presentation Manager
Authors: Veli Hakkoymaz, Alpaslan Altunköprü
Abstract:
A multimedia presentation system refers to the integration of a multimedia database with a presentation manager that has the functionality of content selection, organization and playout of multimedia presentations. It requires high performance from the system components involved. From multimedia information capture until presentation delivery, high-performance tools are required for accessing, manipulating, storing and retrieving these segments, and for transferring and delivering them to a presentation terminal according to a playout order. The organization of presentations is a complex task in that the display order of presentation contents (in time and space) must be specified. A multimedia presentation contains audio, video, image and text media types. The critical decisions for presentation construction include what the contents are and how the contents are organized; once the decision is made on the organization of the contents of the presentation, it must be conveyed to the end user in the correct organizational order and in a timely fashion. This paper introduces a framework for the specification of multimedia presentations and describes the design of sample presentations from a multimedia database using this framework.
Keywords: Multimedia presentation, temporal specification, SMIL, spatial specification.
13191 Optimizing the Project Delivery Time with Time Cost Trade-offs
Authors: Wei Lo, Ming-En Kuo
Abstract:
While minimizing the overall project cost is always one of the objectives of construction managers, obtaining the maximum economic return is definitely one of the ultimate goals of the project investors. As there is a trade-off relationship between project time and cost, and the project delivery time directly affects the timing of economic recovery of an investment project, providing a method that can quantify the relationship between the project delivery time and cost, and identify the optimal delivery time to maximize economic return, has always been a focus of researchers and industrial practitioners. Using genetic algorithms, this study introduces an optimization model that can quantify the relationship between the project delivery time and cost and, furthermore, determine the optimal delivery time to maximize the economic return of the project. The results provide an objective quantification for accurately evaluating the project delivery time and cost, and facilitate the analysis of the economic return of a project.
Keywords: Time-Cost Trade-Off, Genetic Algorithms, Resource Integration, Economic return.
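The sketch below is a hedged, toy genetic-algorithm illustration of the time-cost trade-off idea; the activity options, the serial network assumption and the revenue and overhead figures are invented for the example and do not reproduce the study's model.

```python
import random

random.seed(0)

# Each activity: list of (duration_days, direct_cost) options, crashed to normal.
activities = [
    [(4, 900), (6, 700), (8, 600)],
    [(3, 800), (5, 500)],
    [(5, 1200), (7, 900), (9, 800)],
    [(2, 400), (4, 250)],
]
INDIRECT_PER_DAY = 120.0      # overhead that grows with project duration
DAILY_RETURN = 260.0          # earnings once the project is delivered
HORIZON = 40                  # evaluation horizon in days

def economic_return(chromosome):
    # Serial network for simplicity: duration is the sum of chosen durations.
    duration = sum(activities[i][g][0] for i, g in enumerate(chromosome))
    direct = sum(activities[i][g][1] for i, g in enumerate(chromosome))
    total_cost = direct + INDIRECT_PER_DAY * duration
    return DAILY_RETURN * (HORIZON - duration) - total_cost

def random_chromosome():
    return [random.randrange(len(opts)) for opts in activities]

pop = [random_chromosome() for _ in range(30)]
for _gen in range(100):
    pop.sort(key=economic_return, reverse=True)
    parents = pop[:10]
    children = []
    while len(children) < 20:
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, len(activities))   # one-point crossover
        child = a[:cut] + b[cut:]
        if random.random() < 0.2:                    # mutation
            i = random.randrange(len(activities))
            child[i] = random.randrange(len(activities[i]))
        children.append(child)
    pop = parents + children

best = max(pop, key=economic_return)
dur = sum(activities[i][g][0] for i, g in enumerate(best))
print("best options:", best, "duration:", dur, "return:", round(economic_return(best), 1))
```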
13190 Application of Advanced Oxidation Processes to Mefenamic Acid Elimination
Authors: Olga Gimeno, Javier Rivas, Angel Encinas, Fernando Beltran
Abstract:
The elimination of mefenamic acid has been carried out by photolysis, ozonation, adsorption onto activated carbon (AC) and combinations of the previous single systems (O3+AC and O3+UV). The results obtained indicate that mefenamic acid is not photo-reactive, showing a relatively low quantum yield of the order of 6 x 10^-4 mol Einstein^-1. Application of ozone to mefenamic acid aqueous solutions instantaneously eliminates the pharmaceutical, simultaneously achieving 40% mineralization. Addition of AC to the ozonation process does not enhance the process; moreover, mineralization is completely inhibited compared to the results obtained by single ozonation. The combination of ozone and UV radiation led to the best results in terms of mineralization (60% after 120 min).
Keywords: Photolysis, mefenamic acid, ozone, activated carbon.
13189 Stability and Kinetic Analysis during Vermicomposting of Sewage Sludge
Authors: Ashish Kumar Nayak, Dhamodharan K., Ajay S. Kalamdhad
Abstract:
The present study is aimed at converting sewage sludge into a stable compost product through vermicomposting of sewage sludge mixed with cattle manure and sawdust in five different proportions based on C/N ratios (C/N 15 (R1), 20 (R2), 25 (R3) and 30 (R4); and control (R5)), employing the epigeic earthworm Eisenia fetida. Higher reductions in C/N ratio, CO2 evolution and oxygen uptake rate (OUR) were observed in R4, demonstrating the stability of the compost. In addition, R4 proved to be the best combination for the growth of the earthworms. In order to observe the optimal degradation, the kinetics of organic matter degradation during vermicomposting were quantitatively evaluated. An approach model was developed by assuming that the composting process is carried out in a homogeneous way and that the kinetics of the decomposition reaction are represented by a Monod-type equation. The results exhibit comparable variations in the kinetic constants Km and K3 under varying parameters during the vermicomposting process. The higher R2 value in R4 indicated a better fit to the Lineweaver-Burk plot. R4 also yielded a higher degradability coefficient (K), revealing an optimal nutrient balance, which not only enhanced the affinity of the enzymes towards the substrate but also improved the degradation process. Therefore, R4 proved to be the best feed combination for the vermicomposting process as compared to the other reactors.
Keywords: Vermicomposting, Eisenia fetida, Sewage sludge, C/N ratio, Stability, Enzyme kinetics concept.
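For readers unfamiliar with the Lineweaver-Burk step mentioned above, the following sketch (with made-up substrate/rate data, not the study's measurements) shows how a Monod/Michaelis-Menten-type rate law v = Vmax·S/(Km + S) is linearized as 1/v = (Km/Vmax)(1/S) + 1/Vmax and fitted by least squares to recover Km and Vmax.

```python
import numpy as np

# Hypothetical substrate concentrations S and observed degradation rates v.
S = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
v = np.array([0.28, 0.46, 0.70, 0.95, 1.15, 1.28])

# Lineweaver-Burk: 1/v = (Km/Vmax) * (1/S) + 1/Vmax  (a straight line).
x, y = 1.0 / S, 1.0 / v
slope, intercept = np.polyfit(x, y, 1)

Vmax = 1.0 / intercept
Km = slope * Vmax
r2 = np.corrcoef(x, y)[0, 1] ** 2          # goodness of the linear fit

print(f"Vmax = {Vmax:.3f}, Km = {Km:.3f}, R^2 = {r2:.4f}")
```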
13188 Stability Analysis of Mutualism Population Model with Time Delay
Authors: Rusliza Ahmad, Harun Budin
Abstract:
This paper studies the effect of time delay on the stability of a mutualism population model with limited resources for both species. First, the stability of the model without time delay is analyzed. The model is then improved by considering a time delay in the mechanism of the growth rate of the population. We analyze the effect of the time delay on the stability of the stable equilibrium point. Results showed that the time delay can induce instability of the stable equilibrium point, bifurcation and stability switches.
Keywords: Bifurcation, Delay margin, Mutualism population model, Time delay.
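As a hedged numerical illustration of delay-induced instability, the sketch below integrates the single-species delayed logistic (Hutchinson) equation with a fixed-step Euler scheme; it is a stand-in showing the same mechanism of stability loss, not the paper's two-species mutualism model, whose equations are not given in the abstract.

```python
import numpy as np

def delayed_logistic(tau, r=1.0, K=1.0, T=300.0, dt=0.01, x0=0.5):
    """Euler integration of the delayed logistic (Hutchinson) equation
         dx/dt = r * x(t) * (1 - x(t - tau) / K),
    whose equilibrium x = K is stable for r*tau < pi/2 and oscillates beyond it."""
    n = int(T / dt)
    lag = int(tau / dt)
    x = np.full(n, x0)                      # constant history before t = 0
    for k in range(max(lag, 1), n - 1):
        x[k + 1] = x[k] + dt * r * x[k] * (1.0 - x[k - lag] / K)
    tail = x[int(0.8 * n):]                 # late-time behaviour only
    return tail.max() - tail.min()          # oscillation amplitude

for tau in (0.5, 1.2, 1.8):                 # critical delay is pi/2 ~ 1.57 for r = 1
    amp = delayed_logistic(tau)
    verdict = "settles to the equilibrium" if amp < 1e-3 else "sustained oscillation"
    print(f"tau = {tau:.1f}: late-time amplitude = {amp:.4f} -> {verdict}")
```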
13187 Genetic Algorithms Multi-Objective Model for Project Scheduling
Authors: Elsheikh Asser
Abstract:
Time and cost are the main goals of construction project management. The first schedule developed may not be a suitable schedule for beginning or completing the project to achieve the target completion time at a minimum total cost. In general, there are trade-offs between time and cost (TCT) to complete the activities of a project. This research presents a genetic algorithms (GAs) multi-objective model for project scheduling considering different scenarios such as least cost, least time, and target time.
Keywords: Genetic algorithms, Time-cost trade-off.
13186 Effects of Process Parameters on the Yield of Oil from Coconut Fruit
Authors: Ndidi F. Amulu, Godian O. Mbah, Maxwel I. Onyiah, Callistus N. Ude
Abstract:
Analysis of the properties of coconut (Cocos nucifera) and its oil was carried out in this work using standard analytical techniques. The analyses include the proximate composition of the fruit, extraction of oil from the fruit using different process parameters, and physicochemical analysis of the extracted oil. The results showed the percentage (%) moisture, crude lipid, crude protein, ash and carbohydrate content of the coconut as 7.59, 55.15, 5.65, 7.35 and 19.51, respectively. The oil from the coconut fruit was an odourless and yellowish liquid at room temperature (30 °C). The treatment combinations used (leaching time, leaching temperature and solute:solvent ratio) showed significant differences (P<0.05) in the yield of oil from coconut flour. The oil yield ranged between 36.25% and 49.83%. Lipid indices of the coconut oil indicated an acid value (AV) of 10.05 NaOH/g of oil, free fatty acid (FFA) of 5.03%, saponification value (SV) of 183.26 mg KOH/g of oil, iodine value (IV) of 81.00 I2/g of oil, peroxide value (PV) of 5.00 ml/g of oil and viscosity (V) of 0.002. The standard statistical package Minitab version 16.0 was used for the regression analysis and analysis of variance (ANOVA), and also to generate various plots such as single-effect plots, interaction-effect plots and contour plots. The response, i.e. the yield of oil from the coconut flour, was used to develop a mathematical model that correlates the yield to the process variables studied. The maximum conditions that gave the highest yield of coconut oil were a leaching time of 2 hours, a leaching temperature of 50 °C and a solute/solvent ratio of 0.05 g/ml.
Keywords: Coconut, oil-extraction, optimization, physicochemical, proximate.
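To illustrate the kind of regression model referred to above, here is a minimal least-squares sketch fitting a quadratic response surface that relates oil yield to leaching time, leaching temperature and solute/solvent ratio; the design points and yields are fabricated for the example and are not the paper's data.

```python
import numpy as np

# Hypothetical design points: (time h, temperature C, ratio g/ml) -> yield %.
X = np.array([
    [1.0, 40.0, 0.05], [1.0, 50.0, 0.10], [1.0, 60.0, 0.15],
    [2.0, 40.0, 0.10], [2.0, 50.0, 0.05], [2.0, 60.0, 0.15],
    [3.0, 40.0, 0.15], [3.0, 50.0, 0.10], [3.0, 60.0, 0.05],
    [2.0, 50.0, 0.15], [1.0, 50.0, 0.05], [3.0, 60.0, 0.10],
])
y = np.array([38.2, 40.1, 37.5, 44.0, 49.8, 41.2, 42.5, 45.3, 46.1,
              43.0, 41.0, 43.8])

def design_matrix(X):
    t, T, r = X[:, 0], X[:, 1], X[:, 2]
    # Second-order polynomial terms: intercept, linear, interaction, quadratic.
    return np.column_stack([np.ones(len(X)), t, T, r,
                            t * T, t * r, T * r,
                            t ** 2, T ** 2, r ** 2])

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
pred = design_matrix(X) @ beta
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print("coefficients:", np.round(beta, 3))
print("R^2 on the illustrative data: %.3f" % r2)

# Predict the yield at the reported optimum (2 h, 50 C, 0.05 g/ml).
opt = design_matrix(np.array([[2.0, 50.0, 0.05]]))
print("predicted yield at (2 h, 50 C, 0.05 g/ml): %.1f %%" % (opt @ beta)[0])
```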
13185 Steady State Simulation and Experimental Study of an Ethane Recovery Unit in an Iranian Natural Gas Refinery
Authors: Arash Esmaeili, Omid Ghabouli
Abstract:
The production and consumption of natural gas are on the rise throughout the world as a result of its wide availability, ease of transportation and use, and clean-burning characteristics. The chief use of ethane is in the chemical industry for the production of ethene (ethylene) by steam cracking. In this simulation, the ethane recovery obtained with the gas sub-cooled process (GSP) is 99.9 mol%, of which 32.1% is recovered in the de-methanizer column and 67.8% in the de-ethanizer tower. The outstanding feature of this process is the novel split-vapor concept employed to generate reflux for the de-methanizer column. The remaining ethane in the export gas raises the gross heating value to 36.66 MJ/Nm3 for use in industrial and household consumption.
Keywords: Ethane recovery, Hydrocarbon dew point, Simulation, Water dew point.
13184 Investigation on the Physical Conditions of Façade Systems of Campus Buildings by Infrared Thermography Tests
Authors: N. Türkmenoğlu Bayraktar, E. Kishalı
Abstract:
Campus buildings are educational facilities in which considerable amounts of energy are consumed for lighting, heating, cooling and ventilation. Some of the new universities in Turkey, where this investigation takes place, still carry out their educational activities in existing buildings that were primarily designed for different architectural programs and converted to campus buildings via changes of function, space organization and structural interventions, but most of the time without consideration of appropriate microclimatic conditions. Reducing energy consumption in these structures not only contributes to the national economy but also mitigates negative effects on the environment. Furthermore, optimum thermal comfort conditions should be provided during the refurbishment of existing campus structures and their building envelopes. Considering this issue, the first step is to investigate the climatic performance of building elements with regard to the refurbishment process. In the context of the study, the Kocaeli University Faculty of Design and Architecture building, constructed in the 1980s in the Anıtpark campus located in the central part of Kocaeli, Turkey, was investigated. The climatic factors influencing thermal conditions, the deteriorations of the building envelope, the temperature distribution and the heat losses from façade elements observed by thermography are presented in order to develop strategies for the retrofit of the building envelope. Within the scope of the survey, refurbishment strategies aimed at providing optimum climatic comfort conditions and increasing the energy efficiency of the building envelope are proposed.
Keywords: Building envelope, IRT, refurbishment, non-destructive test.
13183 An Analysis of Genetic Algorithm Based Test Data Compression Using Modified PRL Coding
Authors: K. S. Neelukumari, K. B. Jayanthi
Abstract:
In this paper, genetic-algorithm-based test data compression is targeted at improving the compression ratio and reducing the computation time. The genetic algorithm is based on extended pattern run-length coding. The test set contains a large number of X (don't-care) values that can be effectively exploited to improve test data compression. In this coding method, a reference pattern is set and its compatibility is checked. For this process, a genetic algorithm is proposed to reduce the computation time of the encoding algorithm. This coding technique encodes 2n compatible or inversely compatible patterns into a single test data segment or multiple test data segments. The experimental results show that the compression ratio is improved and the computation time is reduced.
Keywords: Backtracking, test data compression (TDC), x-filling, x-propagating, genetic algorithm.
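The sketch below is not the paper's extended pattern run-length scheme or its genetic algorithm; it only illustrates the underlying idea that 'X' (don't-care) bits make test patterns compatible, or inversely compatible, with a chosen reference pattern, which is what run-length-style encoders exploit.

```python
def compatible(pattern, reference):
    """A pattern is compatible with the reference if every specified bit
    matches; 'X' bits are don't-cares and match anything."""
    return all(p in ("X", r) for p, r in zip(pattern, reference))

def inversely_compatible(pattern, reference):
    flipped = "".join("1" if r == "0" else "0" for r in reference)
    return compatible(pattern, flipped)

def encode(test_patterns, reference):
    """Greedy run-length encoding against a single reference pattern:
    emit (mode, run_length) pairs, where modes C/I mark runs of patterns
    compatible / inversely compatible with the reference, and mode L marks
    a literal pattern that fits neither."""
    encoded, i = [], 0
    while i < len(test_patterns):
        if compatible(test_patterns[i], reference):
            mode, check = "C", compatible
        elif inversely_compatible(test_patterns[i], reference):
            mode, check = "I", inversely_compatible
        else:
            encoded.append(("L", test_patterns[i]))
            i += 1
            continue
        run = 0
        while i < len(test_patterns) and check(test_patterns[i], reference):
            run, i = run + 1, i + 1
        encoded.append((mode, run))
    return encoded

scan_slices = ["1X0X", "1101", "XX0X", "0X1X", "0010", "1000"]
reference = "1101"
print("encoded:", encode(scan_slices, reference))
# -> [('C', 3), ('I', 2), ('L', '1000')]
```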
13182 A Robust Al-Hawalees Gaming Automation using Minimax and BPNN Decision
Authors: Ahmad Sharieh, R Bremananth
Abstract:
Artificial-intelligence-based gaming is an interesting topic in state-of-the-art technology. This paper presents an automation of a traditional Omani game called Al-Hawalees. Its related issues are resolved and implemented using an artificial intelligence approach. An AI approach called the minimax procedure is incorporated to generate diverse moves in the on-line gaming. As the number of moves increases, the time complexity increases proportionally. In order to tackle the time and space complexities, we have employed a back propagation neural network (BPNN), trained off-line, to make a decision on the resources required to fulfill the automation of the game. We have utilized Levenberg-Marquardt training in order to get a rapid response during the gaming. A set of optimal moves is determined by the on-line back propagation training fashioned with alpha-beta pruning. The results and analyses reveal that the proposed scheme can be easily incorporated in an on-line scenario with one player against the system.
Keywords: Artificial neural network, back propagation gaming, Levenberg-Marquardt, minimax procedure.
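Since the scheme builds on minimax with alpha-beta pruning, the following generic sketch shows that pruning step on an abstract game interface; it is a textbook version and does not encode the Al-Hawalees rules or the BPNN part.

```python
import math

def alphabeta(state, depth, alpha, beta, maximizing, moves, evaluate, apply_move):
    """Generic minimax with alpha-beta pruning over an abstract game interface."""
    legal = moves(state)
    if depth == 0 or not legal:
        return evaluate(state), None
    best_move = None
    if maximizing:
        value = -math.inf
        for m in legal:
            child, _ = alphabeta(apply_move(state, m), depth - 1, alpha, beta,
                                 False, moves, evaluate, apply_move)
            if child > value:
                value, best_move = child, m
            alpha = max(alpha, value)
            if alpha >= beta:          # beta cut-off: opponent avoids this branch
                break
        return value, best_move
    value = math.inf
    for m in legal:
        child, _ = alphabeta(apply_move(state, m), depth - 1, alpha, beta,
                             True, moves, evaluate, apply_move)
        if child < value:
            value, best_move = child, m
        beta = min(beta, value)
        if beta <= alpha:              # alpha cut-off
            break
    return value, best_move

# Tiny abstract game: a state is a number, a move adds or subtracts 1 or 2,
# and the evaluation is the state value itself (purely illustrative).
value, move = alphabeta(
    state=0, depth=4, alpha=-math.inf, beta=math.inf, maximizing=True,
    moves=lambda s: [+1, +2, -1, -2],
    evaluate=lambda s: s,
    apply_move=lambda s, m: s + m,
)
print("minimax value:", value, "best first move:", move)
```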
13181 Application of Gamma Frailty Model in Survival of Liver Cirrhosis Patients
Authors: Elnaz Saeedi, Jamileh Abolaghasemi, Mohsen Nasiri Tousi, Saeedeh Khosravi
Abstract:
Goals and Objectives: A typical analysis of survival data involves the modeling of time-to-event data, such as the time till death. A frailty model is a random effect model for time-to-event data, where the random effect has a multiplicative influence on the baseline hazard function. This article aims to investigate the use of the gamma frailty model with concomitant variables in order to individualize the prognostic factors that influence the survival times of liver cirrhosis patients. Methods: During the one-year study period (May 2008-May 2009), data were drawn from the recorded information of patients with liver cirrhosis who were scheduled for liver transplantation and were followed up for at least seven years in Imam Khomeini Hospital in Iran. In order to determine the effective factors for cirrhotic patients' survival in the presence of latent variables, the gamma frailty distribution was applied. Parametric models, such as the Exponential and Weibull distributions, were considered for the survival time. Data analysis was performed using R software, and an error level of 0.05 was considered for all tests. Results: 305 patients with liver cirrhosis, including 180 (59%) men and 125 (41%) women, were studied. The average age of the patients was 39.8 years. At the end of the study, 82 (26%) patients had died, among them 48 (58%) men and 34 (42%) women. The main cause of liver cirrhosis was found to be hepatitis B with 23%, followed by cryptogenic cirrhosis with 22.6% as the second factor. Overall, the mean 7-year survival time was 28.44 months; for deceased and censored patients it was 19.33 and 31.79 months, respectively. Exponential and Weibull multi-parametric survival models with the gamma frailty distribution were fitted to the cirrhosis data. In both models, factors including age, serum bilirubin, serum albumin and encephalopathy had a significant effect on the survival time of cirrhotic patients. Conclusion: To investigate the effective factors for the time of death of patients with liver cirrhosis in the presence of latent variables, the gamma frailty model with parametric distributions seems desirable.
Keywords: Frailty model, latent variables, liver cirrhosis, parametric distribution.
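As a minimal illustration of the parametric survival component alone (without the gamma frailty term or covariates), the sketch below fits a Weibull model to right-censored times by maximum likelihood; the data are simulated, and SciPy's Nelder-Mead optimizer is assumed to be available.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)

# Simulated follow-up: Weibull(shape=1.4, scale=30) event times in months,
# administratively censored at 84 months (seven years).
true_shape, true_scale, cens_time = 1.4, 30.0, 84.0
event_times = true_scale * rng.weibull(true_shape, size=305)
observed = np.minimum(event_times, cens_time)
died = (event_times <= cens_time).astype(float)   # 1 = death observed, 0 = censored

def neg_log_likelihood(params):
    log_shape, log_scale = params                 # optimize on the log scale
    k, lam = np.exp(log_shape), np.exp(log_scale)
    z = observed / lam
    log_hazard = np.log(k / lam) + (k - 1) * np.log(z)
    log_survival = -(z ** k)
    # Deaths contribute hazard * survival, censored times only survival.
    return -np.sum(died * log_hazard + log_survival)

fit = minimize(neg_log_likelihood, x0=[0.0, np.log(observed.mean())],
               method="Nelder-Mead")
k_hat, lam_hat = np.exp(fit.x)
print(f"estimated Weibull shape = {k_hat:.2f}, scale = {lam_hat:.1f} months")
print(f"median survival = {lam_hat * np.log(2) ** (1 / k_hat):.1f} months")
```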
13180 A Robust Software for Advanced Analysis of Space Steel Frames
Authors: Viet-Hung Truong, Seung-Eock Kim
Abstract:
This paper presents a robust software package for practical advanced analysis of space steel framed structures. The pre- and post-processors of the presented software package are coded in C++, while the solver is written in FORTRAN. A user-friendly graphical interface is developed to facilitate the modeling process and the interpretation of results. The solver employs stability functions for capturing second-order effects in order to minimize modeling and computational time. Both plastic-hinge and fiber-hinge beam-column elements are available in the presented software. The generalized displacement control method is adopted to solve the nonlinear equilibrium equations.
Keywords: Advanced analysis, beam-column, fiber-hinge, plastic hinge, steel frame.
13179 Soliton Interaction in Birefringent Fibers with Third-Order Dispersion
Authors: Dowluru Ravi Kumar, Bhima Prabhakara Rao
Abstract:
The propagation of solitons in single-mode birefringent fibers is considered in the presence of third-order dispersion (TOD). The behavior of two neighboring solitons and their interaction is investigated in the presence of third-order dispersion with different group velocity dispersion (GVD) parameters. It is found that third-order dispersion makes the resultant soliton deviate from its ideal position and increases the interaction between adjacent soliton pulses. It is also observed that this deviation due to third-order dispersion is considerably small when the optical pulse propagates at wavelengths relatively far from the zero-dispersion wavelength. The modified coupled nonlinear Schrödinger equations (CNLSE) representing the propagation of optical pulses in a single-mode fiber with TOD are solved using the split-step Fourier algorithm. The results presented in this paper reveal that third-order dispersion can substantially increase the interaction between the solitons, but large group velocity dispersion reduces the interaction between neighboring solitons.
Keywords: Birefringence, Group velocity dispersion, Polarization mode dispersion, Soliton interaction, Third order dispersion.
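For orientation, here is a minimal split-step Fourier sketch for the scalar NLSE with second- and third-order dispersion; it is a simplified stand-in that omits the birefringent coupling of the CNLSE solved in the paper, with all quantities in normalized soliton units.

```python
import numpy as np

def split_step_nlse(u0, dt, dz, steps, delta3):
    """Split-step Fourier solver for the normalized scalar NLSE with TOD:
         i u_z + (1/2) u_tt + |u|^2 u = i * delta3 * u_ttt
    Dispersion (including TOD) is applied in the Fourier domain,
    the Kerr nonlinearity in the time domain."""
    n = u0.size
    w = 2 * np.pi * np.fft.fftfreq(n, d=dt)
    lin = np.exp(-1j * (0.5 * w ** 2 + delta3 * w ** 3) * dz)
    u = u0.copy()
    for _ in range(steps):
        u = np.fft.ifft(lin * np.fft.fft(u))      # dispersive step
        u = u * np.exp(1j * np.abs(u) ** 2 * dz)  # Kerr nonlinearity
    return u

# Time grid and a fundamental soliton launched at t = 0.
n, t_max = 2048, 40.0
t = np.linspace(-t_max, t_max, n, endpoint=False)
dt = t[1] - t[0]
u0 = 1.0 / np.cosh(t)

for delta3 in (0.0, 0.03, 0.06):
    u = split_step_nlse(u0, dt, dz=0.01, steps=2000, delta3=delta3)   # z = 20
    power = np.abs(u) ** 2
    centroid = np.sum(t * power) / np.sum(power)   # temporal centre of the pulse
    print(f"delta3 = {delta3:.2f}: pulse centre shifted to t = {centroid:+.2f}")
```

The printed centroids illustrate the abstract's point qualitatively: with delta3 = 0 the soliton stays put, while increasing TOD shifts it away from its ideal position.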
13178 The Influence of using Compost Leachate on Soil Reaction
Authors: Ali Gholami, Shahram Ahmadi
Abstract:
In areas where high-quality water is not available, unconventional water sources are used for irrigation. Household leachate is one of the sources used in dry and semi-dry areas to water trees and plants. It meets the plants' needs and also has some effects on the soil, but at the same time it might cause some problems as well. This study evaluated the effect of using compost leachate on the concentration of iron in the soil, using a statistical design called "split plot" with two main treatments, one subsidiary treatment and three repetitions over a three-month period. The main N treatment consisted of irrigation using well water as a blank treatment, and the main I treatment consisted of irrigation using leachate and well water concurrently. The subsidiary treatments were DI (drop irrigation) and SDI (sub-drop irrigation). In the established plots, 36 biannual pine and cypress shrubs were randomly grown, and the treatments began two months later. The results revealed a significant difference between the main treatment and the control regarding the pH decline in the soil, which was related to the amount of leachate injected into the soil. With the use of leachate over time, the soil pH fell by as much as 0.46, and it increased again with greater amounts of leachate. Drop irrigation (DI) gave better results than sub-drop irrigation (SDI), since it keeps the soil texture intact.
Keywords: Compost Leachate, Drop irrigation, Soil Reaction
13177 Pre-Eliminary Design Adjustable Workstation for Piston Assembly Line Considering Anthropometric for Indonesian People
Authors: T. Yuri M. Zagloel, Inaki M. Hakim, A. M. Syarafi
Abstract:
The manufacturing process is considered one of the most important activities in a business process. It correlates with the productivity and quality of the product, so that industries can fulfill customer demand. With increasing demand from customers, industries must improve their manufacturing ability, for example by shortening lead times and reducing wastes in their processes. Lean manufacturing is considered one of the tools for waste elimination in the manufacturing or service industry. Workforce development is one practice in lean manufacturing that can reduce waste generated by operators, such as the waste of unnecessary motion. An anthropometric approach is proposed to determine the recommended measurements of the operator's work area. The method uses dimensions of Indonesian people that are related to the piston workstation. The result of this research is a new design for the work area that considers ergonomic aspects.
Keywords: Adjustable, anthropometric, ergonomic, waste.
13176 Double Clustering as an Unsupervised Approach for Order Picking of Distributed Warehouses
Authors: Hsin-Yi Huang, Ming-Sheng Liu, Jiun-Yan Shiau
Abstract:
Planning order picking lists for warehouses to achieve certain operational performance is a significant challenge when the costs associated with logistics are relatively high, and it is especially important in the e-commerce era. Nowadays, many order planning techniques employ supervised machine learning algorithms. However, defining features for supervised machine learning algorithms is not a simple task. Against this background, we consider whether unsupervised algorithms can enhance the planning of order-picking lists. A double zone picking approach, based on using clustering algorithms twice, is developed. A simplified example is given to demonstrate the merit of our approach.
Keywords: Order picking, warehouse, clustering, unsupervised learning.
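A toy version of the double-clustering idea is sketched below with a small hand-rolled k-means: items are first clustered into storage zones by location, and orders are then clustered again into picking batches by their zone-demand profiles. The data and the particular two-stage recipe are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(7)

def kmeans(points, k, iters=50):
    """Plain Lloyd's k-means returning (labels, centroids)."""
    centroids = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return labels, centroids

# Stage 1: cluster item storage locations (x, y) into picking zones.
item_xy = rng.uniform(0, 100, size=(60, 2))
n_zones = 4
item_zone, _ = kmeans(item_xy, n_zones)

# Stage 2: represent each customer order by how many of its lines fall in
# each zone, then cluster the orders into picking batches.
orders = [rng.choice(60, size=rng.integers(3, 8), replace=False) for _ in range(25)]
profiles = np.zeros((len(orders), n_zones))
for o, items in enumerate(orders):
    for it in items:
        profiles[o, item_zone[it]] += 1

n_batches = 3
order_batch, _ = kmeans(profiles, n_batches)

for b in range(n_batches):
    members = np.where(order_batch == b)[0]
    print(f"batch {b}: orders {members.tolist()}")
```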
13175 Fuzzy Logic and Control Strategies on a Sump
Authors: Nasser Mohamed Ramli, Nurul Izzati Zulkifli
Abstract:
A sump can be defined as a reservoir that contains slurry, a mixture of solids and liquid or water. The sump system is an unsteady process owing to its level response. The sump level must be monitored carefully using a good controller to avoid overflow. Since conventional controllers are not able to cope with large time delays and nonlinearities, a fuzzy logic controller is tested to prove its ability to solve these problems for a slurry sump. In order to justify the effectiveness and reliability of these controllers, a simulation of the sump system was created using MATLAB and the results were compared. According to the results obtained, compared with Proportional-Integral (PI) and Proportional-Integral-Derivative (PID) control, the fuzzy logic controller showed the best result by offering a quick response of 0.32 s for a step input and 5 s for a pulse generator, and by producing small Integral Absolute Error (IAE) values of 0.66 and 0.36, respectively.
Keywords: Fuzzy, sump, level, controller.
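To make the fuzzy-control idea concrete, here is a minimal zero-order Sugeno-style fuzzy level controller on a toy sump model, with triangular membership functions on the level error and weighted-average defuzzification; the plant parameters, membership breakpoints and rule outputs are all assumptions for illustration, not the controllers or model used in the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_outflow(error):
    """Zero-order Sugeno controller: three rules on the level error
    (setpoint - level), each mapped to a crisp outflow-valve command."""
    memberships = {
        "level_too_low":  tri(error,  0.0,  1.0,  2.0),   # error positive
        "level_ok":       tri(error, -0.3,  0.0,  0.3),
        "level_too_high": tri(error, -2.0, -1.0,  0.0),   # error negative
    }
    rule_output = {"level_too_low": 0.0, "level_ok": 0.5, "level_too_high": 1.0}
    num = sum(memberships[k] * rule_output[k] for k in memberships)
    den = sum(memberships.values())
    return 0.5 if den == 0 else num / den   # weighted-average defuzzification

# Toy sump: A * dh/dt = q_in - q_out, with q_out set by the valve command.
A, dt, setpoint = 2.0, 0.5, 1.5
h, q_in, q_out_max = 0.5, 0.8, 1.6
iae = 0.0
for step in range(200):
    error = setpoint - h
    valve = fuzzy_outflow(error)
    q_out = valve * q_out_max
    h += dt * (q_in - q_out) / A
    iae += abs(error) * dt                  # Integral Absolute Error accumulator
    if step % 40 == 0:
        print(f"t = {step * dt:5.1f} s  level = {h:.2f}  valve = {valve:.2f}")
print(f"IAE over the run: {iae:.2f}")
```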
13174 Statistical Optimization of Process Conditions for Disinfection of Water Using Defatted Moringa oleifera Seed Extract
Authors: Suleyman A. Muyibi, Munirat, A. Idris, Saedi Jami, Parveen Jamal, Mohd Ismail Abdul Karim
Abstract:
In this study, a statistical optimization design was used to study the optimum disinfection parameters using defatted crude Moringa oleifera seed extracts against Escherichia coli (E. coli) bacterial cells. The classical one-factor-at-a-time (OFAT) approach and response surface methodology (RSM) were used. The possible optimum ranges of dosage, contact time and mixing rate from the OFAT study were 25 mg/l to 200 mg/l, 30 minutes to 240 minutes and 100 rpm to 160 rpm, respectively. Analysis of variance (ANOVA) of the statistical optimization using a face-centered central composite design showed that dosage, contact time and mixing rate were highly significant. The optimum disinfection conditions were a dosage of 125 mg/l, a contact time of 30 minutes and a mixing rate of 120 rpm.
Keywords: E.coli, disinfection, Moringa oleifera, response surface methodology.
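For readers unfamiliar with the design, the sketch below generates a face-centered central composite design (alpha = 1) for the three coded factors and maps the coded levels back onto the dosage, contact-time and mixing-rate ranges reported from the OFAT study; it reproduces only the design matrix, not the study's responses.

```python
import itertools
import numpy as np

# Factor ranges taken from the OFAT screening above (low, high).
ranges = {
    "dosage_mg_per_l": (25.0, 200.0),
    "contact_time_min": (30.0, 240.0),
    "mixing_rate_rpm": (100.0, 160.0),
}
names = list(ranges)
k = len(names)

# Face-centred CCD in coded units: 2^k factorial corners, 2k axial (face)
# points with alpha = 1, and centre-point replicates.
factorial = np.array(list(itertools.product([-1, 1], repeat=k)), dtype=float)
axial = np.vstack([row for i in range(k)
                   for row in (np.eye(k)[i], -np.eye(k)[i])])
centre = np.zeros((3, k))                 # three centre-point replicates
coded = np.vstack([factorial, axial, centre])

def decode(coded_row):
    real = []
    for value, name in zip(coded_row, names):
        lo, hi = ranges[name]
        mid, half = (lo + hi) / 2.0, (hi - lo) / 2.0
        real.append(mid + value * half)
    return real

print("run  " + "  ".join(f"{n:>18}" for n in names))
for i, row in enumerate(coded, start=1):
    print(f"{i:3d}  " + "  ".join(f"{v:18.1f}" for v in decode(row)))
print(f"total runs: {len(coded)} (= 2^{k} factorial + {2 * k} axial + 3 centre)")
```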
13173 A Simplified and Effective Algorithm Used to Mine Similar Processes: An Illustrated Example
Authors: Min-Hsun Kuo, Yun-Shiow Chen
Abstract:
The running logs of a process hold valuable information about its executed activity behavior and generated activity logic structure. These informative logs can be extracted, analyzed and utilized to improve the efficiency of the process's execution and conduction. One of the techniques used to accomplish such process improvement is called process mining, and mining similar processes is one improvement task within it. Rather than directly mining similar processes using a single comparison coefficient or a complicated fitness function, this paper presents a simplified heuristic process mining algorithm with two similarity comparisons that relatively conform the activity logic sequences (traces) of the mined processes with those of a normalized (regularized) one. The relative process conformance is to find which of the mined processes match the required activity sequences and relationships, and further to support necessary and sufficient applications of the mined processes to process improvements. One similarity is defined by the number of similar activity sequences existing in different processes; the other similarity expresses the degree of the similar (identical) activity sequences among the conforming processes. Since these two similarities are defined with respect to typical behavior (activity sequences) occurring in an entire process, the common problems of other process conformance techniques, such as the inappropriateness of an absolute comparison and the inability to elicit intrinsic information, can be avoided by the relative process comparison presented in this paper. To demonstrate the potential of the proposed algorithm, a numerical example is illustrated.
Keywords: Process mining, process similarity, artificial intelligence, process conformance.
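A rough sketch of the two similarity notions described above is given below, interpreted under our own simplifying assumption that "activity sequences" are directly-follows pairs compared against a normalized reference trace; it is illustrative only and not the paper's exact coefficients.

```python
def bigrams(trace):
    """Directly-follows activity pairs of a trace, e.g. ABCD -> {AB, BC, CD}."""
    return {trace[i:i + 2] for i in range(len(trace) - 1)}

def shared_sequence_count(trace, reference):
    """Similarity 1: how many of the reference's activity sequences
    also occur in the candidate trace (absolute count)."""
    return len(bigrams(trace) & bigrams(reference))

def conformance_degree(trace, reference):
    """Similarity 2: the share of the reference's activity sequences
    covered by the candidate trace (relative degree, 0..1)."""
    ref = bigrams(reference)
    return len(bigrams(trace) & ref) / len(ref) if ref else 0.0

# Normalized (regularized) process and three mined candidate processes,
# written as strings of activity labels (hypothetical example).
reference = "ABCDEF"
candidates = {"P1": "ABCDEF", "P2": "ABDCEF", "P3": "AXBYCZ"}

for name, trace in candidates.items():
    print(name,
          "shared sequences:", shared_sequence_count(trace, reference),
          "conformance degree: %.2f" % conformance_degree(trace, reference))
```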
13172 Applying Fuzzy FP-Growth to Mine Fuzzy Association Rules
Authors: Chien-Hua Wang, Wei-Hsuan Lee, Chin-Tzong Pang
Abstract:
In data mining, association rules are used to find associations between the different items in a transaction database. As data are collected and stored, valuable rules can be found through association rules, which can be applied to help managers execute marketing strategies and establish sound market frameworks. This paper aims to use fuzzy Frequent Pattern growth (FFP-growth) to derive fuzzy association rules. First, we apply fuzzy partition methods and decide a membership function for the quantitative value of each transaction item. Next, we implement FFP-growth to carry out the data mining process. In addition, in order to understand the impact of the Apriori algorithm and the FFP-growth algorithm on the execution time and the number of generated association rules, experiments are performed using databases of different sizes and different thresholds. Lastly, the experimental results show that the FFP-growth algorithm is more efficient than other existing methods.
Keywords: Data mining, association rule, fuzzy frequent pattern growth.
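The sketch below illustrates only the first two steps described above: a fuzzy partition of a quantitative attribute into Low/Middle/High membership grades and the computation of fuzzy support for item pairs. The support counting is done by brute force rather than by building an FP-tree, and the membership breakpoints, threshold and transactions are invented for the example.

```python
from itertools import combinations

def fuzzy_partition(qty):
    """Triangular membership of a purchase quantity in Low / Middle / High."""
    low = max(0.0, min(1.0, (5.0 - qty) / 4.0))    # 1 at qty<=1, 0 at qty>=5
    high = max(0.0, min(1.0, (qty - 5.0) / 4.0))   # 0 at qty<=5, 1 at qty>=9
    mid = max(0.0, 1.0 - low - high)
    return {"Low": low, "Middle": mid, "High": high}

# Quantitative transactions: item -> purchased quantity (hypothetical data).
transactions = [
    {"bread": 2, "milk": 6, "beer": 1},
    {"bread": 8, "milk": 7},
    {"milk": 5, "beer": 9, "bread": 1},
    {"bread": 3, "beer": 8},
]

# Fuzzify every transaction into (item, label) -> membership grade.
fuzzy_tx = []
for tx in transactions:
    grades = {}
    for item, qty in tx.items():
        for label, mu in fuzzy_partition(qty).items():
            if mu > 0:
                grades[(item, label)] = mu
    fuzzy_tx.append(grades)

# Fuzzy support of a pair = sum over transactions of the min of the two grades.
min_support = 0.8
items = sorted({key for tx in fuzzy_tx for key in tx})
for a, b in combinations(items, 2):
    support = sum(min(tx.get(a, 0.0), tx.get(b, 0.0)) for tx in fuzzy_tx)
    if support >= min_support:
        print(f"{a} & {b}: fuzzy support = {support:.2f}")
```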
13171 Texture Feature Extraction of Infrared River Ice Images using Second-Order Spatial Statistics
Authors: Bharathi P. T, P. Subashini
Abstract:
Ice cover has a significant impact on rivers as it affects the ice melting capacity, which results in flooding, restricts navigation, and modifies the ecosystem and microclimate. River ice is made up of different ice types with varying thickness, so surveillance of river ice plays an important role. River ice types are captured using an infrared imaging camera, which captures images even at night. In this paper, the river ice infrared texture images are analysed using first-order and second-order statistical methods. The second-order statistical methods considered are the spatial gray level dependence method, the gray level run length method and the gray level difference method. The performance of the feature extraction methods is evaluated using a Probabilistic Neural Network classifier, and it is found that the first-order and second-order statistical methods on their own yield low accuracy. Therefore, the features extracted from the first-order and second-order statistical methods are combined, and it is observed that these combined features (first-order statistical method + gray level run length method) provide higher accuracy compared with the features from either the first-order or the second-order statistical method alone.
Keywords: Gray Level Difference Method, Gray Level Run Length Method, Kurtosis, Probabilistic Neural Network, Skewness, Spatial Gray Level Dependence Method.
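The sketch below computes the spatial gray level dependence (co-occurrence) matrix for one displacement and a few classic second-order texture features from it, contrasting a smooth synthetic patch with a speckled one; the Probabilistic Neural Network classification stage is not reproduced.

```python
import numpy as np

def glcm(image, dx=1, dy=0, levels=8):
    """Gray level co-occurrence matrix for displacement (dx, dy), normalized."""
    h, w = image.shape
    mat = np.zeros((levels, levels))
    for y in range(h - dy):
        for x in range(w - dx):
            mat[image[y, x], image[y + dy, x + dx]] += 1
    return mat / mat.sum()

def texture_features(p):
    i, j = np.indices(p.shape)
    contrast = np.sum(((i - j) ** 2) * p)
    energy = np.sum(p ** 2)                      # angular second moment
    homogeneity = np.sum(p / (1.0 + np.abs(i - j)))
    entropy = -np.sum(p[p > 0] * np.log(p[p > 0]))
    return {"contrast": contrast, "energy": energy,
            "homogeneity": homogeneity, "entropy": entropy}

rng = np.random.default_rng(3)

# Two synthetic 8-level patches: a smooth gradient vs. speckled noise.
smooth = np.tile(np.linspace(0, 7, 32).astype(int), (32, 1))
speckled = rng.integers(0, 8, size=(32, 32))

for name, img in (("smooth", smooth), ("speckled", speckled)):
    feats = texture_features(glcm(img))
    print(name, {k: round(float(v), 3) for k, v in feats.items()})
```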
13170 Developments for "Virtual" Monitoring and Process Simulation of the Cryogenic Pilot Plant
Authors: Carmen Maria Moraru, Iuliana Stefan, Ovidiu Balteanu, Ciprian Bucur, Liviu Stefan, Anisia Bornea, Ioan Stefanescu
Abstract:
The implementation of new software and hardware technologies in tritium processing nuclear plants, and especially in those of an experimental character or developing new technologies, involves a degree of complexity due to the issues raised by integrating high-performance instrumentation and equipment into a unitary monitoring system for the nuclear technological process of tritium removal. Keeping the system flexible is a requirement of nuclear experimental plants, for which changes of configuration, process and parameters are usual. The large amount of data that needs to be processed, stored and accessed for real-time simulation and optimization demands the creation of a virtual technological platform in which the data acquisition, control and analysis systems of the technological process can be integrated with a developed technological monitoring system. Thus, the integrated computing and monitoring systems needed for supervising the technological process will be implemented, to be followed by an optimization system, by choosing new, high-performing methods corresponding to the technological processes within tritium removal processing nuclear plants. The software applications are developed with the support of program packages dedicated to industrial processes and will include acquisition and monitoring sub-modules, named "virtual", as well as a storage sub-module for the process data later required by the optimization and simulation software for the tritium removal technological process. The system plays an important role in environmental protection and sustainable development through new technologies, that is, the reduction of and fight against industrial accidents in the case of tritium processing nuclear plants. Research on monitoring optimisation of nuclear processes is also a major driving force for economic and social development.
Keywords: Monitoring system, process simulation.