Search results for: work in process (WIP)
7418 Mathematical Expression for Machining Performance
Authors: Md. Ashikur Rahman Khan, M. M. Rahman
Abstract:
In electrical discharge machining (EDM), a complete and clear theory has not yet been established. The developed theories (physical models) yield results far from reality due to the complexity of the physics, and it is difficult to select proper parameter settings in order to achieve better EDM performance. However, modelling can solve this critical problem concerning the parameter settings. Therefore, the purpose of the present work is to develop mathematical models to predict the performance characteristics of EDM on Ti-5Al-2.5Sn titanium alloy. Response surface method (RSM) and artificial neural network (ANN) techniques are employed to develop the models. The developed models are verified through analysis of variance (ANOVA). The ANN models are trained, tested, and validated utilizing a set of data. It is found that the developed ANN and mathematical models can predict the performance of EDM effectively. Thus, the models provide a precise tool that makes the EDM process more cost-effective and efficient.
Keywords: Analysis of variance, artificial neural network, material removal rate, modelling, response surface method, surface finish.
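As an illustration of the RSM side of such modelling, the sketch below fits a second-order (quadratic) response surface by least squares and reports its R². The parameter names (peak current, pulse-on time) and the synthetic "material removal rate" data are assumptions for demonstration only, not the authors' measurements or fitted model.

```python
# Minimal sketch of an RSM-style second-order regression fit (assumed inputs/data).
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical process settings: peak current (A) and pulse-on time (us)
X = rng.uniform([1.0, 10.0], [8.0, 400.0], size=(30, 2))

# Synthetic "material removal rate" response used only to exercise the fit
y = 0.5 + 0.8 * X[:, 0] + 0.01 * X[:, 1] - 0.05 * X[:, 0] ** 2 + rng.normal(0, 0.1, 30)

def quadratic_design(X):
    """Build the second-order RSM design matrix: 1, x_i, x_i*x_j, x_i^2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

A = quadratic_design(X)
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)   # least-squares fit of the model
y_hat = A @ coeffs

# Simple goodness-of-fit summary: R^2 of the fitted response surface
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
print("coefficients:", np.round(coeffs, 4))
print("R^2:", round(1 - ss_res / ss_tot, 4))
```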
7417 A Review on Building Information Modelling in Nigeria and Its Potentials
Authors: Mansur Hamma-Adama, Tahar Kouider
Abstract:
The construction industry has been evolving since the development of Building Information Modelling (BIM). This technological process is unstoppable; it has reached the market with remarkable case studies addressing the industry's long history of fragmentation. The industry has been changing over time: the United States has recorded the most significant development in construction digitalization, and Australia, the United Kingdom and some other developed nations are also amongst the promoters of the BIM process and its development. Recently, developing countries such as China and Malaysia have been keying into the industry's digital shift, while very little movement is seen in South Africa, whose development is considered higher and which is perhaps the leader in the digital transition amongst the African countries. To the authors' best knowledge, the Nigerian construction industry has never engaged in BIM discussions and hence the topic has received no attention at the national level. Consequently, Nigeria has no "noteworthy BIM publications." Decision makers and key stakeholders need to be informed of the current trend of the industry's development (BIM in particular) and the opportunities of adopting this digitalization trend in relation to the identified challenges. The BIM concept can be traced mostly in architectural practices rather than engineering practices in Nigeria. A superficial BIM practice is found at the organisational level only, operating model-based "BIM stage 1." Research into adopting this innovation has received very little attention. This piece of work is based on a literature review, aimed at exploring BIM in Nigeria and its prospects. The exploration reveals limitations in the availability of literature and in extensive research on the development of BIM in the country. Numerous challenges were noticed, including building collapse, inefficiencies, cost overruns and late project delivery. BIM has the potential to overcome these challenges and more. A low level of BIM adoption with a reasonable level of awareness is noticed. However, the lack of policy and guidelines, as well as a serious lack of experts in the field, are amongst the major barriers to BIM adoption. The industry needs to embrace BIM to be able to compete with its global counterparts.
Keywords: Adoption, BIM, CAD, construction industry, Nigeria, opportunities.
7416 A Digital Twin Approach for Sustainable Territories Planning: A Case Study on District Heating
Authors: A. Amrani, O. Allali, A. Ben Hamida, F. Defrance, S. Morland, E. Pineau, T. Lacroix
Abstract:
The energy planning process is a very complex task that involves several stakeholders and requires the consideration of several local and global factors and constraints. In order to optimize and simplify this process, we propose a tool-based iterative approach applied to district heating planning. We build our tool in collaboration with a French territory, using actual district data and implementing the European incentives. We set up an iterative process including data visualization and analysis, identification and extraction of information related to the area concerned by the operation, design of sustainable planning scenarios leveraging local renewable and recoverable energy sources, and finally, the evaluation of scenarios. The last step is performed by a dynamic digital twin replica of the city. The territory's energy experts confirm that the tool provides them with valuable support towards sustainable energy planning.
Keywords: Climate change, data management, decision support, digital twin, district heating, energy planning, renewables, smart city.
7415 Detecting and Tracking Vehicles in Airborne Videos
Authors: Hsu-Yung Cheng, Chih-Chang Yu
Abstract:
In this work, we present an automatic vehicle detection system for airborne videos using combined features. We propose a pixel-wise classification method for vehicle detection using Dynamic Bayesian Networks. In spite of performing pixel-wise classification, relations among neighboring pixels in a region are preserved in the feature extraction process. The main novelty of the detection scheme is that the extracted combined features comprise not only pixel-level information but also region-level information. Afterwards, tracking is performed on the detected vehicles using an efficient Kalman filter with dynamic particle sampling. Experiments were conducted on a wide variety of airborne videos. We do not assume prior information about camera heights, orientation, or target object sizes in the proposed framework. The results demonstrate the flexibility and good generalization ability of the proposed method on a challenging dataset.
Keywords: Vehicle Detection, Airborne Video, Tracking, Dynamic Bayesian Networks.
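To make the tracking stage concrete, the sketch below runs a constant-velocity Kalman filter over a short sequence of detected vehicle positions. The state layout, noise levels and sample detections are illustrative assumptions; the dynamic particle sampling used in the paper is not reproduced here.

```python
# Minimal constant-velocity Kalman filter sketch for tracking one detected vehicle.
import numpy as np

dt = 1.0                                   # frame interval
F = np.array([[1, 0, dt, 0],               # state transition (constant velocity)
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],                # we only observe the (x, y) position
              [0, 1, 0, 0]], dtype=float)
Q = 0.01 * np.eye(4)                       # process noise
R = 1.0 * np.eye(2)                        # measurement noise

x = np.zeros(4)                            # initial state [x, y, vx, vy]
P = 10.0 * np.eye(4)                       # initial covariance

detections = [(1.1, 0.9), (2.0, 2.1), (3.2, 2.9), (3.9, 4.1)]  # per-frame (x, y)

for z in detections:
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the new detection
    z = np.asarray(z, dtype=float)
    innovation = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ innovation
    P = (np.eye(4) - K @ H) @ P
    print("position:", np.round(x[:2], 2), "velocity:", np.round(x[2:], 2))
```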
7414 Burstiness Reduction of a Doubly Stochastic AR-Modeled Uniform Activity VBR Video
Authors: J. P. Dubois
Abstract:
Stochastic modeling of network traffic is an area of significant research activity for current and future broadband communication networks. Multimedia traffic is statistically characterized by a bursty variable bit rate (VBR) profile. In this paper, we develop an improved model for uniform activity level video sources in ATM using a doubly stochastic autoregressive model driven by an underlying spatial point process. We then examine a number of burstiness metrics such as the peak-to-average ratio (PAR), the temporal autocovariance function (ACF) and the traffic measurements histogram. We found that the former measure is most suitable for capturing the burstiness of single scene video traffic. In the last phase of this work, we analyse the statistical multiplexing of several constant scene video sources. This proved, as expected, to be advantageous with respect to reducing the burstiness of the traffic, as long as the sources are statistically independent. We observed that the burstiness diminished rapidly, with the largest gain occurring when only around 5 sources are multiplexed. The novel model used in this paper for characterizing uniform activity video was thus found to be an accurate model.
Keywords: AR, ATM, burstiness, doubly stochastic, statistical multiplexing.
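As a small illustration of the burstiness measurement and the multiplexing gain described above, the sketch below generates AR(1) bit-rate traces, computes their peak-to-average ratio (PAR), and shows how the PAR of an aggregate of independent sources drops. The AR coefficient and rate levels are illustrative assumptions, not the paper's fitted doubly stochastic model.

```python
# Minimal sketch: AR(1) VBR trace, PAR burstiness metric, statistical multiplexing.
import numpy as np

rng = np.random.default_rng(1)

def ar1_trace(n, phi=0.9, mean=5.0, sigma=1.0):
    """First-order autoregressive bit-rate trace (Mbps), clipped at zero."""
    x = np.empty(n)
    x[0] = mean
    for t in range(1, n):
        x[t] = mean + phi * (x[t - 1] - mean) + rng.normal(0, sigma)
    return np.clip(x, 0, None)

def par(trace):
    """Peak-to-average ratio, the burstiness metric favoured in the abstract."""
    return trace.max() / trace.mean()

single = ar1_trace(5000)
print("PAR of a single source:", round(par(single), 3))

# Statistical multiplexing: PAR of the sum of k independent sources
for k in (2, 5, 10):
    aggregate = sum(ar1_trace(5000) for _ in range(k))
    print(f"PAR of {k} multiplexed sources:", round(par(aggregate), 3))
```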
7413 Visual Analytics in K-12 Education - Emerging Dimensions of Complexity
Authors: Linnea Stenliden
Abstract:
The aim of this paper is to understand the emerging learning conditions when visual analytics are implemented and used in K-12 education. To date, little attention has been paid to the role visual analytics (digital media and technology that highlight visual data communication in order to support analytical tasks) can play in education, and to the extent to which these tools can process actionable data for young students. This study was conducted in three public K-12 schools, in four social science classes with students aged 10 to 13 years, over a period of two to four weeks at each school. Empirical data were generated using video observations and analyzed with the help of metaphors within Actor-network theory (ANT). The learning conditions are found to be distinguished by broad complexity, characterized by four dimensions. These emerge from the actors' deeply intertwined relations in the activities. In relation to the found dimensions, the paper argues that novel approaches to teaching and learning could benefit students' knowledge building as they work with visual analytics, analyzing visualized data.
Keywords: Analytical reasoning, complexity, data use, problem space, visual analytics, visual storytelling, translation.
7412 Memory Leak Detection in Distributed System
Authors: Roohi Shabrin S., Devi Prasad B., Prabu D., Pallavi R. S., Revathi P.
Abstract:
Due to memory leaks, valuable system memory often gets wasted and is denied to other processes, thereby affecting computational performance. If an application's memory usage exceeds the virtual memory size, it can lead to a system crash. Current memory leak detection techniques for clusters are reactive and display the memory leak information only after the execution of the process (they detect a memory leak only after it occurs). This paper presents a Dynamic Memory Monitoring Agent (DMMA) technique. The DMMA framework performs dynamic memory leak detection, identifying leaks while the application is in its execution phase. When a memory leak in any process in the cluster is identified, DMMA informs the end users to enable them to take corrective actions, and it also submits the affected process to a healthy node in the system, thus providing a reliable service to the user. DMMA maintains information about the memory consumption of executing processes, and based on this information and critical states, DMMA can improve the reliability and efficaciousness of cluster computing.
Keywords: Dynamic Memory Monitoring Agent (DMMA), Cluster Computing, Memory Leak, Fault Tolerant Framework, Dynamic Memory Leak Detection (DMLD).
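The single-node sketch below shows the kind of runtime monitoring such an agent performs: sampling a process's resident memory at intervals and flagging sustained growth as a possible leak. It is an illustrative stand-in, not the DMMA framework itself; the growth threshold, window size and use of the third-party psutil package are assumptions.

```python
# Minimal single-node memory-growth monitor (illustrative stand-in for DMMA).
import os
import time
import psutil  # third-party: pip install psutil

def monitor(pid, samples=10, interval=1.0, growth_threshold=0.95):
    """Warn if resident memory grows in at least `growth_threshold` of the intervals."""
    proc = psutil.Process(pid)
    history = []
    for _ in range(samples):
        history.append(proc.memory_info().rss)   # resident set size in bytes
        time.sleep(interval)
    increases = sum(1 for a, b in zip(history, history[1:]) if b > a)
    if increases >= growth_threshold * (len(history) - 1):
        print(f"possible memory leak in pid {pid}: "
              f"{history[0]} -> {history[-1]} bytes over {samples} samples")
    else:
        print(f"pid {pid}: no sustained growth detected")

if __name__ == "__main__":
    monitor(os.getpid(), samples=5, interval=0.2)
```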
7411 Performance Comparison of Different Regression Methods for a Polymerization Process with Adaptive Sampling
Authors: Florin Leon, Silvia Curteanu
Abstract:
Developing complete mechanistic models for polymerization reactors is not easy, because complex reactions occur simultaneously, a large number of kinetic parameters are involved, and the chemical and physical phenomena for mixtures involving polymers are sometimes poorly understood. To overcome these difficulties, empirical models based on sampled data can be used instead, namely regression methods typical of the machine learning field. They have the ability to learn the trends of a process without any knowledge about its particular physical and chemical laws. Therefore, they are useful for modeling complex processes, such as the free radical polymerization of methyl methacrylate achieved in a batch bulk process. The goal is to generate accurate predictions of monomer conversion, numerical average molecular weight and gravimetrical average molecular weight. This process is associated with non-linear gel and glass effects. For this purpose, an adaptive sampling technique is presented, which can select more samples around the regions where the values have a higher variation. Several machine learning methods are used for the modeling and their performance is compared: support vector machines, k-nearest neighbor and random forest, as well as an original algorithm, large margin nearest neighbor regression. The suggested method provides very good results compared to the other well-known regression algorithms.
Keywords: Adaptive sampling, batch bulk methyl methacrylate polymerization, large margin nearest neighbor regression, machine learning.
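The sketch below mirrors the comparison described above on a toy problem: a nonlinear 1-D response is sampled uniformly, a few extra samples are added where the response varies fastest (a simple form of adaptive sampling), and several standard regressors are compared by RMSE. The test function stands in for the polymerization data; the paper's own large margin nearest neighbor regression is not included.

```python
# Minimal sketch: adaptive sampling plus a comparison of standard regressors.
import numpy as np
from sklearn.svm import SVR
from sklearn.neighbors import KNeighborsRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error

def response(x):
    """Stand-in for a strongly nonlinear process output (e.g. conversion)."""
    return np.tanh(4 * (x - 0.5)) + 0.2 * np.sin(10 * x)

# Uniform base sample, then refinement where the local change is largest
x_base = np.linspace(0, 1, 20)
y_base = response(x_base)
variation = np.abs(np.diff(y_base))
hot = np.argsort(variation)[-5:]                    # 5 most rapidly varying intervals
x_extra = (x_base[hot] + x_base[hot + 1]) / 2       # midpoints of those intervals
x_train = np.sort(np.concatenate([x_base, x_extra]))
y_train = response(x_train)

x_test = np.linspace(0, 1, 200)
y_test = response(x_test)

models = {
    "SVR": SVR(C=10.0, epsilon=0.01),
    "k-NN": KNeighborsRegressor(n_neighbors=3),
    "Random forest": RandomForestRegressor(n_estimators=200, random_state=0),
}
for name, model in models.items():
    model.fit(x_train.reshape(-1, 1), y_train)
    pred = model.predict(x_test.reshape(-1, 1))
    print(f"{name}: RMSE = {mean_squared_error(y_test, pred) ** 0.5:.4f}")
```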
7410 Towards a New Methodology for Developing Web-Based Systems
Authors: Omer Ishag Eldai, Ahmed Hassan M. H. Ali, S. Raviraja
Abstract:
Web-based systems have become increasingly important because the Internet and the World Wide Web have become ubiquitous, surpassing all other technological developments in our history. The Internet, and especially company websites, has rapidly evolved in scope and extent of use, from being little more than fixed advertising material, i.e. a "web presence" with no particular influence on the company's business, to being one of the most essential parts of the company's core business. Traditional software engineering approaches with process models such as, for example, CMM and waterfall models do not work very well, since web system development differs from traditional development. The development differs in several ways: for example, there is a large gap between traditional software engineering designs and concepts and the low-level implementation model, and many web-based system development activities are business-oriented (for example, web applications are sales-oriented, while web applications and intranets are content-oriented) rather than engineering-oriented. This paper introduces the Increment Iterative extreme Programming (IIXP) methodology for developing web-based systems. In contrast to existing methodologies, it is a combination of different traditional and modern software engineering and web engineering principles.
Keywords: Web-based systems, web engineering.
7409 NiO-CeO2 Nano-Catalyst for the Removal of Priority Organic Pollutants from Wastewater through Catalytic Wet Air Oxidation at Mild Conditions
Authors: Anushree, Chhaya Sharma, Satish Kumar
Abstract:
Catalytic wet air oxidation (CWAO) is normally carried out at elevated temperature and pressure. This work investigates the potential of a NiO-CeO2 nano-catalyst in the CWAO of paper industry wastewater under the milder operating conditions of 90 °C and 1 atm. The NiO-CeO2 nano-catalysts were synthesized by a simple co-precipitation method and characterized by X-ray diffraction (XRD), before and after use, in order to study any crystallographic change during the experiment. The extent of metal leaching from the catalyst was determined using inductively coupled plasma optical emission spectrometry (ICP-OES). The catalytic activity of the nano-catalysts was studied in terms of total organic carbon (TOC), adsorbable organic halides (AOX) and chlorophenolics (CHPs) removal. Interestingly, the mixed oxide catalysts exhibited higher activity than the corresponding single-metal oxides. The maximum removal efficiency was achieved with the Ce40Ni60 catalyst. The results indicate that the CWAO process is efficient in removing priority organic pollutants from wastewater, as it exhibited up to 59% TOC, 55% AOX, and 54% CHPs removal.
Keywords: Nano-materials, NiO-CeO2, wastewater, wet air oxidation.
7408 A New Hybrid K-Mean-Quick Reduct Algorithm for Gene Selection
Authors: E. N. Sathishkumar, K. Thangavel, T. Chandrasekhar
Abstract:
Feature selection is a process to select features which are more informative. It is one of the important steps in knowledge discovery. The problem is that not all genes are important in gene expression data: some of the genes may be redundant, and others may be irrelevant and noisy. Here, a novel hybrid K-Mean-Quick Reduct (KMQR) algorithm is proposed for gene selection from gene expression data. In this study, the entire dataset is divided into clusters by applying the K-Means algorithm, so that each cluster contains similar genes. The highly class-discriminative genes are selected based on their degree of dependence by applying the Quick Reduct algorithm to all the clusters. The Average Correlation Value (ACV) is calculated for the highly class-discriminative genes. The clusters with an ACV of 1 are determined to be significant clusters, whose classification accuracy is equal to or higher than the accuracy of the entire dataset. The proposed algorithm is evaluated using WEKA classifiers and compared; the proposed work shows high classification accuracy.
Keywords: Clustering, Gene Selection, K-Mean-Quick Reduct, Rough Sets.
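The sketch below illustrates only the clustering front end of this idea: genes are grouped by K-Means on their expression profiles and each cluster is scored with an average correlation value. The Quick Reduct (rough-set) stage is not reproduced, the data are synthetic, and the ACV used here is a simplified assumption (mean absolute pairwise Pearson correlation within a cluster), not necessarily the paper's exact formula.

```python
# Minimal sketch of K-Means gene clustering with a simplified ACV score.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic expression matrix: rows = genes, columns = samples
n_genes, n_samples = 60, 20
genes = rng.normal(size=(n_genes, n_samples))
genes[:15] += np.sin(np.linspace(0, 3, n_samples))      # one correlated block of genes

def acv(block):
    """Mean absolute pairwise correlation of the genes in a cluster (simplified ACV)."""
    corr = np.corrcoef(block)
    upper = corr[np.triu_indices_from(corr, k=1)]
    return float(np.mean(np.abs(upper)))

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(genes)
for label in range(4):
    members = genes[km.labels_ == label]
    if len(members) < 2:
        continue
    print(f"cluster {label}: {len(members)} genes, ACV = {acv(members):.3f}")
# Clusters whose ACV approaches 1 would be the "significant" ones passed on
# to the Quick Reduct stage in the paper's pipeline.
```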
7407 Co-existence of Thai Muslim People and Other in an Ancient Community Located in the Heart of Bangkok: The Case Study of Petchaburi 7 Community
Authors: Saowapa Phaithayawat
Abstract:
The objectives of the study are: 1) to study the way of life in terms of the one hundred years of co-existence of the Muslim and local community in this area, and 2) to analyze the factors that enable this community to co-exist happily. The study combines quantitative research into the community's history with the study of its people. The result of this study showed that the area of the Petchaburi 7 community is an ancient area which has been owned by Muslims for almost 100 years, with a sanctuary as a center of unity. Later, Bangkok became more developed and provided more infrastructure, such as the motorway and other transportation; however, the owners of the land in this community still keep their land and have built many buildings to run businesses. For this reason, many non-Muslim people have come to live here in co-existence: it is not only convenient for them to work but also easy to travel by sky train. Several factors make this co-existence harmonious: 1) All Muslims in this area strictly follow their rules and allocate part of their community for business. 2) The people who come and live here are middle-aged working men and women, who rent rooms close to their work. 3) There are Muslim foods and desserts, especially roti, the popular fried flour dessert, and local cha chak, a tea originating from the south of Thailand; these are popular with working men and women, who gather for them after work. 4) All Muslims in this area are free to lead their own lives, although society changes rapidly.
Keywords: Co-existence, Muslim and other group of people, the ancient community.
7406 The Use of Performance Indicators for Evaluating Models of Drying Jackfruit (Artocarpus heterophyllus L.): Page, Midilli, and Lewis
Authors: D. S. C. Soares, D. G. Costa, J. T. S., A. K. S. Abud, T. P. Nunes, A. M. Oliveira Júnior
Abstract:
Mathematical models of drying are used for the purpose of understanding the drying process in order to determine important parameters for the design and operation of the dryer. The jackfruit is a fruit with high consumption in the Northeast and high perishability, so it is necessary to apply techniques to preserve it for longer in order to distribute it to regions with low consumption. This study aimed to analyze several mathematical models (Page, Lewis, and Midilli) and to indicate the one that best fits the conditions of the convective drying process, using the performance indicators associated with each model: accuracy (Af) and bias (Bf) factors, root mean square error (RMSE) and standard error of prediction (%SEP). Jackfruit drying was carried out in a convective tray dryer at a temperature of 50 °C for 9 hours. It is observed that the Midilli model was the most accurate, with Af: 1.39, Bf: 1.33, RMSE: 0.01%, and SEP: 5.34. However, the Midilli model is not appropriate for process control purposes due to its need for four tuning parameters. With the performance indicators used in this paper, the Page model showed similar results with only two parameters. It is concluded that the best correlation between the experimental and estimated data is given by the Page model.
Keywords: Drying, models, jackfruit.
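As a worked illustration of this model-evaluation workflow, the sketch below fits the two-parameter Page model to a synthetic moisture-ratio curve and computes the indicators named above. The Af, Bf, RMSE and %SEP formulas used here are the forms commonly used in the drying literature and are assumptions; the data points are not the jackfruit measurements.

```python
# Minimal sketch: fit the Page drying model and compute Af, Bf, RMSE and %SEP.
import numpy as np
from scipy.optimize import curve_fit

def page(t, k, n):
    """Page model: MR = exp(-k * t^n), t in hours."""
    return np.exp(-k * t ** n)

t = np.linspace(0.5, 9.0, 12)                         # drying time (h)
mr_obs = np.exp(-0.25 * t ** 1.1) + np.random.default_rng(0).normal(0, 0.01, t.size)

popt, _ = curve_fit(page, t, mr_obs, p0=(0.1, 1.0))
mr_pred = page(t, *popt)

ratio = mr_pred / mr_obs
af = 10 ** np.mean(np.abs(np.log10(ratio)))           # accuracy factor
bf = 10 ** np.mean(np.log10(ratio))                   # bias factor
rmse = np.sqrt(np.mean((mr_obs - mr_pred) ** 2))      # root mean square error
sep = 100 * rmse / np.mean(mr_obs)                    # % standard error of prediction

print("Page parameters k, n:", np.round(popt, 4))
print(f"Af={af:.3f}  Bf={bf:.3f}  RMSE={rmse:.4f}  %SEP={sep:.2f}")
```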
7405 Handwriting Velocity Modeling by Artificial Neural Networks
Authors: Mohamed Aymen Slim, Afef Abdelkrim, Mohamed Benrejeb
Abstract:
Handwriting is a physical demonstration of a complex cognitive process learnt by humans from childhood. People with disabilities or suffering from various neurological diseases face many difficulties resulting from problems located in the muscle stimuli (EMG) or in the signals from the brain (EEG), which arise at the stage of writing. The handwriting velocity of the same writer or of different writers varies according to different criteria: age, attitude, mood, writing surface, etc. Therefore, it is interesting to build an experimental database of records taking, as the primary reference, the writing speed of different writers, which would allow studying the global system during the handwriting process. This paper deals with a new approach to modeling the handwriting system based on the velocity criterion, through the concepts of artificial neural networks, specifically Radial Basis Function (RBF) neural networks. The obtained simulation results show a satisfactory agreement between the responses of the developed neural model and the experimental data for various letters and forms, and thus the efficiency of the proposed approach.
Keywords: ElectroMyoGraphic (EMG) signals, Experimental approach, Handwriting process, Radial Basis Functions (RBF) neural networks, Velocity Modeling.
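The sketch below shows the structure of such an RBF model in its simplest form: Gaussian hidden units spread over the input range and a linear output layer solved by least squares. The bell-shaped "velocity profile" fitted here is synthetic and the number of centres and their width are assumptions, not the authors' trained network.

```python
# Minimal Gaussian RBF network fitted to a synthetic stroke-velocity profile.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 120)                              # normalized stroke time
velocity = np.exp(-((t - 0.5) / 0.15) ** 2) + rng.normal(0, 0.02, t.size)

centers = np.linspace(0, 1, 12)                         # RBF centres
width = 0.08                                            # common Gaussian width

def rbf_features(x):
    """Gaussian activations of every hidden unit for inputs x."""
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

Phi = rbf_features(t)
weights, *_ = np.linalg.lstsq(Phi, velocity, rcond=None)  # output-layer weights
prediction = Phi @ weights

rmse = np.sqrt(np.mean((velocity - prediction) ** 2))
print(f"RBF network with {len(centers)} units, RMSE = {rmse:.4f}")
```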
7404 On the Parameter Optimization of Fuzzy Inference Systems
Authors: Erika Martinez Ramirez, Rene V. Mayorga
Abstract:
Nowadays, more engineering systems are using some kind of Artificial Intelligence (AI) in the development of their processes. Some well-known AI techniques include artificial neural nets, fuzzy inference systems, and neuro-fuzzy inference systems, among others. Furthermore, many decision-making applications base their intelligent processes on fuzzy logic, due to the capability of Fuzzy Inference Systems (FIS) to deal with problems that are based on user knowledge and experience. Also, knowing that users have a wide variety of characteristics and generally provide uncertain data, this information can be used and properly processed by a FIS. To properly consider uncertainty and inexact system input values, FIS normally use Membership Functions (MF) that represent a degree of user satisfaction with certain conditions and/or constraints. In order to define the parameters of the MFs, the knowledge of experts in the field is very important. This knowledge defines the MF shape used to process the user inputs, and through fuzzy reasoning and inference mechanisms, the FIS can provide an "appropriate" output. However, an important issue immediately arises: How can it be assured that the obtained output is the optimum solution? How can it be guaranteed that each MF has an optimum shape? A viable solution to these questions is the optimization of the MF parameters. In this paper, a novel parameter optimization process is presented. The process for FIS parameter optimization consists of five simple steps that can be easily realized off-line. The proposed FIS parameter optimization process is demonstrated by its implementation on an intelligent interface section dealing with the on-line customization/personalization of internet portals applied to e-commerce.
Keywords: Artificial Intelligence, Fuzzy Logic, Fuzzy Inference Systems, Nonlinear Optimization.
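The sketch below shows the core of this idea in miniature: the parameters of a single triangular membership function over a hypothetical "price" input are treated as decision variables and tuned against sample user ratings by numerical optimization. A full FIS would tune many MFs and rules the same way; the five-step procedure of the paper is not reproduced, and the data and MF choice are assumptions.

```python
# Minimal sketch: optimize the parameters of one triangular membership function.
import numpy as np
from scipy.optimize import minimize

def triangular_mf(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    left = np.clip((x - a) / max(b - a, 1e-9), 0, 1)
    right = np.clip((c - x) / max(c - b, 1e-9), 0, 1)
    return np.minimum(left, right)

# Hypothetical expert/user data: normalized 'price' vs. satisfaction degree
x_data = np.array([0.0, 0.2, 0.35, 0.5, 0.65, 0.8, 1.0])
y_data = np.array([0.05, 0.4, 0.85, 1.0, 0.8, 0.35, 0.0])

def loss(params):
    a, b, c = params
    if not (a < b < c):                       # keep the triangle well formed
        return 1e6
    return float(np.sum((triangular_mf(x_data, a, b, c) - y_data) ** 2))

result = minimize(loss, x0=[0.0, 0.5, 1.0], method="Nelder-Mead")
print("optimized MF parameters (a, b, c):", np.round(result.x, 3))
print("fit error:", round(result.fun, 4))
```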
7403 Multi-Disciplinary Optimisation Methodology for Aircraft Load Prediction
Authors: Sudhir Kumar Tiwari
Abstract:
The paper demonstrates a methodology that can be used at an early design stage of any conventional aircraft. This research activity assesses the feasibility of deriving a methodology for aircraft load estimation during the various design phases of a transport category aircraft by utilizing the potential of commercial finite element analysis software, which may yield significant time savings. The early design phase has limited data, and quickly changing configurations result in a large number of load cases to handle. It is useful to idealize the aircraft as a connection of beams, which can be very accurately modelled using finite element analysis (beam elements). This research explores the correct approach towards idealizing an aircraft using beam elements. FEM techniques like inertia relief were studied for implementation during the course of the work. The correct boundary condition technique is envisaged for the generation of shear force, bending moment and torque diagrams for the aircraft. The possible applications of this approach to the aircraft design process have been investigated.
Keywords: Multi-disciplinary optimization, aircraft load, finite element analysis, Stick Model.
7402 Lagrange and Multilevel Wavelet-Galerkin with Polynomial Time Basis for Heat Equation
Authors: Watcharakorn Thongchuay, Puntip Toghaw, Montri Maleewong
Abstract:
The Wavelet-Galerkin finite element method for solving the one-dimensional heat equation is presented in this work. Two types of basis functions, the Lagrange and multi-level wavelet bases, are employed to derive the full form of the matrix system. We consider both linear and quadratic bases in the Galerkin method. The time derivative is approximated by a polynomial time basis that allows the order of approximation in time to be extended easily. Our numerical results show that the rates of convergence for the linear Lagrange and linear wavelet bases are the same, of order 2, while the rates of convergence for the quadratic Lagrange and quadratic wavelet bases are approximately of order 4. It also reveals that the wavelet basis provides an easy way to improve numerical resolution, which can be done simply by increasing the desired number of levels in the multilevel construction process.
Keywords: Galerkin finite element method, heat equation, Lagrange basis function, wavelet basis function.
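For reference, the sketch below assembles the linear Lagrange Galerkin system for the 1-D heat equation u_t = u_xx on [0,1] with homogeneous Dirichlet boundary conditions and steps it with backward Euler. Backward Euler is used here in place of the paper's polynomial time basis, and the wavelet and quadratic variants are not reproduced; it only illustrates the assembly-and-solve structure.

```python
# Minimal linear Lagrange FEM for u_t = u_xx on [0,1], u(0)=u(1)=0, backward Euler.
import numpy as np

n_el, T, dt = 40, 0.1, 1e-3
h = 1.0 / n_el
nodes = np.linspace(0, 1, n_el + 1)

M = np.zeros((n_el + 1, n_el + 1))       # mass matrix
K = np.zeros((n_el + 1, n_el + 1))       # stiffness matrix
Me = h / 6.0 * np.array([[2.0, 1.0], [1.0, 2.0]])
Ke = 1.0 / h * np.array([[1.0, -1.0], [-1.0, 1.0]])
for e in range(n_el):                    # assemble element contributions
    idx = [e, e + 1]
    M[np.ix_(idx, idx)] += Me
    K[np.ix_(idx, idx)] += Ke

u = np.sin(np.pi * nodes)                # initial condition
interior = slice(1, n_el)                # Dirichlet BC: drop the boundary nodes
A = M[interior, interior] + dt * K[interior, interior]

t = 0.0
while t < T - 1e-12:                     # backward Euler time stepping
    rhs = M[interior, interior] @ u[interior]
    u[interior] = np.linalg.solve(A, rhs)
    t += dt

exact = np.exp(-np.pi ** 2 * T) * np.sin(np.pi * nodes)
print("max nodal error:", np.max(np.abs(u - exact)))
```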
7401 Comparing the Quality of Service of Bus Companies Operating in two Cities in Brazil
Authors: D. I. De Souza, D. Kipper, G. P. Azevedo
Abstract:
The main objective of this work is to compare the quality of service of the bus companies operating in the city of Rio Branco, located in the state of Acre, with the quality of service of the bus companies operating in the city of Campos, situated in the state of Rio de Janeiro, both cities in Brazil. This comparison, based on the opinion of the bus users, will determine their degree of satisfaction with the service available in both cities. The outcome of this evaluation shows that the users are unhappy with the quality of the service provided by the bus companies operating in both cities, and that there is a need to identify alternative solutions that may minimize the consequences caused by the main problems detected in this work. With these alternatives available, the bus companies will be able to better understand the needs of their customers in terms of manpower, service cost, time schedule, etc.
Keywords: Public Transportation, Quality of Service, Riders' Opinion, Bus Companies.
7400 Improvement of Reaction Technology of Decalin Halogenation
Authors: Dmitriy Yu. Korulkin, Ravshan M. Nuraliev, Raissa A. Muzychkina
Abstract:
In this research paper, the main regularities of the radical bromination reaction of decalin were investigated. The effects of temperature, reaction duration, process repetition rate, the ratio of initial components, and the type and amount of initiator on the degree of decalin bromination were studied. Optimum conditions were specified for the synthesis of perbromodecalin by the decalin bromination method. A technological flowchart for producing perbromodecalin and the mass balance of the process for the first and subsequent loadings of components were developed. The results of research on the antibacterial and antifungal activity of the synthesized bromo derivatives are presented.
Keywords: Decalin, optimum technology, perbromodecalin, radical bromination.
7399 Mathematical Modeling to Predict Surface Roughness in CNC Milling
Authors: Ab. Rashid M.F.F., Gan S.Y., Muhammad N.Y.
Abstract:
Surface roughness (Ra) is one of the most important requirements in the machining process. In order to obtain better surface roughness, the proper setting of the cutting parameters is crucial before the process takes place. This research presents the development of a mathematical model for surface roughness prediction before the milling process in order to evaluate the fitness of the machining parameters: spindle speed, feed rate and depth of cut. 84 samples were run in this study using a FANUC CNC milling machine α-T14iE. The samples were randomly divided into two data sets: the training set (m=60) and the testing set (m=24). ANOVA analysis showed that at least one of the population regression coefficients was not zero. The multiple regression method was used to determine the correlation between a criterion variable and a combination of predictor variables. It was established that the surface roughness is most influenced by the feed rate. Using the multiple regression equation, the average percentage deviation was 9.8% for the testing set and 9.7% for the training set. This showed that the statistical model could predict the surface roughness with about 90.2% accuracy for the testing data set and 90.3% accuracy for the training data set.
Keywords: Surface roughness, regression analysis.
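The sketch below reproduces the shape of this workflow on synthetic data: a linear multiple-regression model Ra = b0 + b1·speed + b2·feed + b3·depth is fitted on a training split and scored by average percentage deviation on a held-out split. The 84-sample split mirrors the abstract, but the cutting data, parameter ranges and coefficients are placeholders, not the study's measurements.

```python
# Minimal multiple-regression sketch for surface roughness prediction.
import numpy as np

rng = np.random.default_rng(0)
n = 84
speed = rng.uniform(1000, 5000, n)        # spindle speed (rpm)
feed = rng.uniform(50, 400, n)            # feed rate (mm/min)
depth = rng.uniform(0.2, 2.0, n)          # depth of cut (mm)
ra = 0.4 + 0.002 * feed + 0.3 * depth - 0.00005 * speed + rng.normal(0, 0.05, n)

X = np.column_stack([np.ones(n), speed, feed, depth])
train, test = slice(0, 60), slice(60, 84)             # 60 training / 24 testing samples

beta, *_ = np.linalg.lstsq(X[train], ra[train], rcond=None)

def avg_pct_deviation(X_part, y_part):
    """Average percentage deviation of predictions from measured Ra."""
    pred = X_part @ beta
    return 100 * np.mean(np.abs(pred - y_part) / y_part)

print("coefficients b0..b3:", np.round(beta, 5))
print("avg % deviation, training:", round(avg_pct_deviation(X[train], ra[train]), 2))
print("avg % deviation, testing:", round(avg_pct_deviation(X[test], ra[test]), 2))
```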
7398 A Study of the Variables in the Optimisation of a Platinum Precipitation Process
Authors: Tebogo Phetla, Edison Muzenda, M Belaid
Abstract:
This study investigated possible ways to improve the efficiency of the platinum precipitation process using ammonium chloride by reducing the platinum content reporting to the effluent. The ore treated consists of five platinum group metals, namely ruthenium, rhodium, iridium, platinum and palladium, and the precious metal gold. Gold, ruthenium, rhodium and iridium were extracted prior to the platinum precipitation process. Temperature, reducing agent, flow rate and potential difference were the variables controlled to determine the operating conditions for optimum platinum precipitation efficiency. Hydrogen peroxide was added as the oxidizing agent at a temperature of 85-90 °C, and a potential difference of 700-850 mV was the variable used to check the oxidation state of the platinum. The platinum was further purified at a temperature between 60-65 °C, a potential difference above 700 mV and 200 l of ammonium chloride. At these conditions, the platinum content reporting to the effluent was reduced to less than 300 ppm, resulting in optimum platinum precipitation efficiency and a purity of 99.9%.
Keywords: Platinum Group Metals (PGM), Potential difference, Precipitation, Redox reactions.
7397 Process Development of Safe and Ready-to-eat Raw Oyster Meat by Irradiation Technology
Authors: Pattama Ratana-Arporn, Pongtep Wilaipun
Abstract:
The white scar oyster (Crassostrea belcheri) is often eaten raw and is a leading vehicle for foodborne disease, especially Salmonella Weltevreden, which was found to be the most prominent and the most resistant to radiation. Gamma irradiation at a low dose of 1 kGy was enough to eliminate S. Weltevreden contamination in oyster meat at levels up to 5 log CFU/g, while the meat still retained its raw characteristics and a sensory quality equivalent to the non-irradiated product. Process development of ready-to-eat chilled oyster meat was conducted by shucking the meat, packing it individually in plastic bags, subjecting it to 1 kGy gamma radiation under chilled conditions, and then storing it at a refrigerated temperature of 4 °C. Microbiological determination showed the absence of S. Weltevreden (initially inoculated at 5 log CFU/g) over the whole storage time of 30 days. Sensory evaluation indicated decreasing sensory scores over the storage time, which determined the product shelf life to be 18 days, compared to 15 days for the non-irradiated product. The main advantages of the developed process are that it provides safe raw oysters to consumers while retaining sensory quality, with a 3-day extension in shelf life.
Keywords: decontamination, food safety, irradiation, oyster, Salmonella Weltevreden.
7396 Decision Support System for Flood Crisis Management using Artificial Neural Network
Authors: Muhammad Aqil, Ichiro Kita, Akira Yano, Nishiyama Soichi
Abstract:
This paper presents an alternative approach that uses an artificial neural network to simulate the flood level dynamics in a river basin. The algorithm was developed in a decision support system environment in order to enable users to process the data. The decision support system is found to be useful due to its interactive nature, flexibility in approach and evolving graphical features, and it can be adopted for any similar situation to predict the flood level. The main data processing includes the gauging station selection, input generation, lead-time selection/generation, and length of prediction. The program enables users to process the flood level data, to train/test the model using various inputs and to visualize the results. The program code consists of a set of files, which can also be modified to suit other purposes. The running results indicate that the decision support system applied to the flood level seems to have reached encouraging results for the river basin under examination. The comparison of the model predictions with the observed data was satisfactory, with the model able to forecast the flood level up to 5 hours in advance with reasonable prediction accuracy. Finally, this program may also serve as a tool for real-time flood monitoring and process control.
Keywords: Decision Support System, Neural Network, Flood Level.
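The forecasting core of such a system can be sketched as a small neural network fed with lagged water levels that predicts the level several hours ahead, as below. The synthetic hydrograph, the number of lags and the 5-hour lead time are illustrative assumptions standing in for real gauging-station data; the decision-support interface itself is not reproduced.

```python
# Minimal lead-time flood-level forecasting sketch with a small neural network.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
hours = np.arange(2000)
level = 2.0 + np.sin(hours / 24.0) + 0.3 * np.sin(hours / 7.0) + rng.normal(0, 0.05, hours.size)

lags, lead = 6, 5                         # use the last 6 hours, forecast 5 hours ahead
X, y = [], []
for i in range(lags, len(level) - lead):
    X.append(level[i - lags:i])
    y.append(level[i + lead])
X, y = np.array(X), np.array(y)

split = int(0.8 * len(X))
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])

pred = model.predict(X[split:])
rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
print(f"{lead}-hour-ahead forecast RMSE: {rmse:.3f} m")
```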
7395 Simultaneous Saccharification and Fermentation (SSF) of Sugarcane Bagasse - Kinetics and Modeling
Authors: E.Sasikumar, T.Viruthagiri
Abstract:
Simultaneous Saccharification and Fermentation (SSF) of sugarcane bagasse by cellulase and Pachysolen tannophilus MTCC 1077 was investigated in the present study. Important process variables for ethanol production from pretreated bagasse were optimized using Response Surface Methodology (RSM) based on central composite design (CCD) experiments. A 2^3 five-level CCD with central and axial points was used to develop a statistical model for the optimization of process variables such as incubation temperature (25–45 °C) X1, pH (5.0–7.0) X2 and fermentation time (24–120 h) X3. Data obtained from the RSM experiments on ethanol production were subjected to analysis of variance (ANOVA) and analyzed using a second-order polynomial equation, and contour plots were used to study the interactions among the three relevant variables of the fermentation process. The fermentation experiments were carried out using an online-monitored modular fermenter of 2 L capacity. The processing parameter setup giving a maximum response for ethanol production was obtained when applying the optimum values for temperature (32 °C), pH (5.6) and fermentation time (110 h). A maximum ethanol concentration of 3.36 g/l was obtained from 50 g/l pretreated sugarcane bagasse at the optimized process conditions in aerobic batch fermentation. Kinetic models such as the Monod model, the modified logistic model, the modified logistic model incorporating the Luedeking-Piret model and the modified logistic model incorporating the modified Luedeking-Piret model were evaluated, and their constants were predicted.
Keywords: Sugarcane bagasse, ethanol, optimization, Pachysolen tannophilus.
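To illustrate the kinetic-fitting step, the sketch below fits a generic three-parameter logistic product-formation curve to a synthetic ethanol time course by nonlinear least squares. The logistic form and the data points are illustrative assumptions in the spirit of the modified logistic models evaluated above, not the paper's exact equations or results.

```python
# Minimal sketch: fit a logistic-type kinetic model to fermentation data.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, p_max, k, t_c):
    """Logistic product formation: P(t) = p_max / (1 + exp(-k * (t - t_c)))."""
    return p_max / (1.0 + np.exp(-k * (t - t_c)))

t = np.array([0, 12, 24, 36, 48, 60, 72, 84, 96, 108, 120], dtype=float)       # h
ethanol = np.array([0.05, 0.15, 0.5, 1.1, 1.9, 2.6, 3.0, 3.2, 3.3, 3.35, 3.36])  # g/L

popt, _ = curve_fit(logistic, t, ethanol, p0=(3.4, 0.1, 40.0))
residuals = ethanol - logistic(t, *popt)
r2 = 1 - np.sum(residuals ** 2) / np.sum((ethanol - ethanol.mean()) ** 2)

print("fitted P_max, k, t_c:", np.round(popt, 3))
print("R^2 of the kinetic fit:", round(r2, 4))
```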
7394 Specialized Reduced Models of Dynamic Flows in 2-Stroke Engines
Authors: S. Cagin, X. Fischer, E. Delacourt, N. Bourabaa, C. Morin, D. Coutellier, B. Carré, S. Loumé
Abstract:
The complexity of scavenging by ports and its impact on engine efficiency create the need to understand and model it as realistically as possible. However, there are few empirical scavenging models, and these are highly specialized; in a design optimization process they appear very restricted, and their field of use is limited. This paper presents a comparison of two methods to establish and reduce a model of the scavenging process in 2-stroke diesel engines. To address the lack of scavenging models, a CFD model has been developed and is used as the reference case. However, its large size requires a reduction. Two techniques have been tested, depending on their fields of application: the NTF method and neural networks. Both appear highly appropriate, drastically reducing the model's size (over 90% reduction) with a low relative error rate (under 10%). Furthermore, each method produces a reduced model which can be used in a distinct, specialized field of application: the distribution of a quantity (mass fraction, for example) in the cylinder at each time step (pseudo-dynamic model), or the qualification of scavenging at the end of the process (pseudo-static model).
Keywords: Diesel engine, Design optimization, Model reduction, Neural network, NTF algorithm, Scavenging.
7393 CFD Simulations to Examine Natural Ventilation of a Work Area in a Public Building
Authors: An-Shik Yang, Chiang-Ho Cheng, Jen-Hao Wu, Yu-Hsuan Juan
Abstract:
Natural ventilation has played an important role in many low-energy building designs. It has also been recognized as an essential means of persistently bringing fresh, cool air from the outside into a building. This study carried out computational fluid dynamics (CFD)-based simulations to examine the natural ventilation development of a work area in a public building. The simulated results can be useful for better understanding the indoor microclimate and the interaction of wind with buildings. Besides, this CFD simulation procedure can serve as an effective analysis tool to characterize the airing performance and thereby optimize building ventilation, supporting architects, planners and other decision makers in improving the natural ventilation design of public buildings.
Keywords: CFD simulations, Natural ventilation, Microclimate, Wind environment.
7392 The Sign in the Communication Process
Authors: S. Pesina, T. Solonchak
Abstract:
In the process of information transmission (concept verbalization), we deal mostly with the substance (contents), and only then pay attention to the form. Recalling events from the remote past, we often cannot exactly reproduce the specific words heard or pronounced, nor the syntactic structures. We remember events, feelings, images; we recall the general contents of the discourse. The thought acquires a specific language form only during the concept verbalization phase. With minimal time for pondering, and depending on the level of language competence, the grammatical and syntactic shaping often occurs automatically, with the use of well-known models and stereotypes. This means that the language form adapts itself to the consciousness, and not vice versa.
Keywords: Lexical eidos, phenomenology, noema, polysemantic word, semantic core.
7391 Didactic Material Resources in the Teaching of National History and Geography: Selected Results of a Qualitative Survey
Authors: Martin Skutil, Klára Havlíčková, Renata Matějíčková
Abstract:
The paper is the first output of a larger research project conducted at the Faculty of Education of the University of Hradec Králové, which aims at an improved understanding of teachers' work in the subject of National History and Geography. Partial findings focusing on the use of didactic material resources in teaching are presented in this phase. With regard to the promotion of students' independent activity within learner-based education, the material equipment of schools with didactic aids is becoming increasingly important. This paper is based on qualitative research, in which the possibilities and, mainly, the reasons for the use of material didactic resources in teaching were investigated through semi-structured interviews. Attention was focused on the ways of working with different teaching aids and their implementation in the educational process. It turns out that teachers accept current constructivist and humanistic approaches to education associated with the requirement to prepare students for life in an information society, and they adjust their teaching accordingly.
Keywords: Primary education, National History and Geography, didactic material resources, qualitative research.
7390 Library Aware Power Conscious Realization of Complementary Boolean Functions
Authors: Padmanabhan Balasubramanian, C. Ardil
Abstract:
In this paper, we consider the problem of logic simplification for a special class of logic functions, namely complementary Boolean functions (CBF), targeting low power implementation using a static CMOS logic style. The functions are uniquely characterized by the presence of terms where, for a canonical binary 2-tuple, D(mj) ∪ D(mk) = { } and therefore |D(mj) ∪ D(mk)| = 0 [19]. Similarly, D(Mj) ∪ D(Mk) = { } and hence |D(Mj) ∪ D(Mk)| = 0. Here, 'mk' and 'Mk' represent a minterm and a maxterm, respectively. We compare the circuits minimized with our proposed method with those corresponding to the factored Reed-Muller (f-RM) form, the factored Pseudo Kronecker Reed-Muller (f-PKRM) form, and the factored Generalized Reed-Muller (f-GRM) form. We have opted for algebraic factorization of the Reed-Muller (RM) form and its different variants, using the factorization rules of [1], as it is simple and requires much less CPU execution time compared to Boolean factorization operations. This technique has enabled us to greatly reduce the literal count as well as the gate count needed for such RM realizations, which are generally prone to consuming more cells and consequently more power. However, this leads to a drawback in terms of the design-for-test attribute associated with the various RM forms. Though we still preserve the definition of those forms, viz. realizing such functionality with only select types of logic gates (AND and XOR gates), the structural integrity of the logic levels is not preserved. This would consequently alter the testability properties of such circuits, i.e. it may increase, decrease or maintain the number of test input vectors needed for their exhaustive testing, subsequently affecting their generalized test vector computation. We do not consider the issue of design-for-testability here, but instead focus on the power consumption of the final logic implementation, after realization with a conventional CMOS process technology (0.35 micron TSMC process). The quality of the resulting circuits, evaluated on the basis of an established cost metric, viz. power consumption, demonstrates average savings of 26.79% for the samples considered in this work, besides reductions in the number of gates and input literals of 39.66% and 12.98%, respectively, in comparison with the other factored RM forms.
Keywords: Reed-Muller forms, Logic function, Hamming distance, Algebraic factorization, Low power design.
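For background on the RM forms the abstract compares against, the sketch below derives the standard positive-polarity Reed-Muller (AND/XOR) expansion of a small function from its truth table using the XOR butterfly (Moebius) transform. It illustrates the AND/XOR realization only; it is not the authors' algebraic factorization procedure or their CBF characterization, and the example function is arbitrary.

```python
# Minimal positive-polarity Reed-Muller (ANF) expansion from a truth table.
def reed_muller_coefficients(truth_table):
    """Return ANF coefficients; index bits select which variables appear in a term."""
    coeffs = list(truth_table)
    n = len(coeffs).bit_length() - 1
    step = 1
    for _ in range(n):                       # in-place XOR butterfly over each variable
        for i in range(0, len(coeffs), 2 * step):
            for j in range(i, i + step):
                coeffs[j + step] ^= coeffs[j]
        step *= 2
    return coeffs

def term(index, names):
    """Human-readable AND term for a coefficient index, '1' for the constant."""
    vs = [names[b] for b in range(len(names)) if index >> b & 1]
    return ".".join(vs) if vs else "1"

# Example: f(a, b, c) = a XOR (b AND c); truth table indexed with a as the LSB.
tt = [(i & 1) ^ (((i >> 1) & 1) & ((i >> 2) & 1)) for i in range(8)]
coeffs = reed_muller_coefficients(tt)
terms = [term(i, "abc") for i, c in enumerate(coeffs) if c]
print("PPRM form: f =", " XOR ".join(terms))   # prints: f = a XOR b.c
```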
7389 Placement of Implants in Palatum of a Teenager without Maxillary Incisor Teeth
Authors: Luan Mavriqi, Ilma Robo, Emin Kuzimi, Egresa Baca
Abstract:
The process of skeletal growth in an adolescent significantly affects the displacement of implants placed in the palatine suture. The problems caused by this process have an impact on dental function and aesthetics. If fixed prostheses were placed on implants, the whole structure would impede maxillary growth. This is the significant difference between the maxilla and the mandible, as the lower jaw has no comparable growth process that would displace the implants or be inhibited by them. A teenage patient suffered an accident accompanied by the loss of the maxillary central incisors. The patient's main complaints are aesthetics and phonetics. The patient's dental history includes a Maryland bridge, which was accompanied by dissatisfaction on the part of the patient. Implant placement is not indicated, as jaw augmentation may lead to displacement of the implant. The treatment plan includes the placement of implants in the palatum, where the bone thickness allows such a procedure. In this article, only the first stage of treatment is presented; implant treatment is ongoing and will be followed by the second phase when the patient reaches the age of 18 years.
Keywords: Implants, palatum, adolescent, primary incisor teeth.