Search results for: wasteless method of ores processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 21640

20140 Comparison of Cyclone Design Methods for Removal of Fine Particles from Plasma Generated Syngas

Authors: Mareli Hattingh, I. Jaco Van der Walt, Frans B. Waanders

Abstract:

A waste-to-energy plasma system was designed by Necsa for commercial use to create electricity from unsorted municipal waste. Fly ash particles must be removed from the syngas stream at operating temperatures of 1000 °C and recycled back into the reactor for complete combustion. A 2D2D high-efficiency cyclone separator was chosen for this purpose. During this study, two cyclone design methods were explored: the Classic Empirical Method (smaller cyclone) and the Flow Characteristics Method (larger cyclone). These designs were optimized with regard to efficiency, so as to remove at least 90% of the fly ash particles of average size 10 μm by 50 μm. Wood was used as feed source at a concentration of 20 g/m³ of syngas. The two designs were then compared at room temperature, using Perspex test units and three feed gases of different densities, namely nitrogen, helium and air. System conditions were imitated by adapting the gas feed velocity and particle load for each gas respectively. Helium, the least dense of the three gases, would simulate higher temperatures, whereas air, the densest gas, would simulate a lower temperature. The average cyclone efficiencies ranged between 94.96% and 98.37%, reaching up to 99.89% in individual runs. The lowest efficiency attained was 94.00%. Furthermore, the design of the smaller cyclone proved to be more robust, while the larger cyclone demonstrated a stronger correlation between its separation efficiency and the feed temperature. The larger cyclone can be assumed to achieve slightly higher efficiencies at elevated temperatures. However, both design methods led to good designs. At room temperature, the difference in efficiency between the two cyclones was almost negligible. At higher temperatures, however, these general tendencies are expected to be amplified so that the difference between the two design methods will become more obvious.
Though the design specifications were met for both designs, the smaller cyclone is recommended as default particle separator for the plasma system due to its robust nature.

Keywords: Cyclone, design, plasma, renewable energy, solid separation, waste processing

Procedia PDF Downloads 211
20139 Roasting Degree of Cocoa Beans by Artificial Neural Network (ANN) Based Electronic Nose System and Gas Chromatography (GC)

Authors: Juzhong Tan, William Kerr

Abstract:

Roasting is a critical step in chocolate processing, during which special flavors develop, moisture content decreases, and better processing properties are obtained. Determination of the roasting degree of cocoa beans is therefore important for chocolate manufacturers to ensure the quality of chocolate products, and it also determines the commercial value of cocoa beans collected from cocoa farmers. Assessment of roasting degree currently relies on human specialists, who are sometimes biased, and on chemical analysis, which takes a long time and is inaccessible to many manufacturers and farmers. In this study, a self-made electronic nose system consisting of gas sensors (TGS 800 and 2000 series) was used to detect the gas generated by cocoa beans at different roasting degrees (0, 20, 30, and 40 min), and the signals collected by the gas sensors were used to train a three-layer ANN. Chemical analysis of the graded beans was performed with a traditional GC-MS system, and the contents of volatile chemical compounds were used to train another ANN as a reference to the ANN trained on electronic nose signals. Both trained ANNs were then used to predict the roasting degree of cocoa beans for validation. The best grading accuracy achieved by the ANN trained on electronic nose signals (using signals from TGS 813, 826, 820, 880, 830, 2620, 2602, and 2610) was 96.7%, whereas the GC-trained ANN achieved an accuracy of 83.8%.
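The classification pipeline described above can be sketched with a small three-layer network. The sketch below is only illustrative: it uses synthetic stand-in data for eight sensor channels and a hand-rolled NumPy network; the sensor values, network width, and training settings are assumptions, not the study's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for 8 gas-sensor channels at 4 roasting degrees
# (0, 20, 30 and 40 min); real e-nose readings would replace this.
n_per_class, n_sensors, n_classes = 40, 8, 4
centers = 3.0 * rng.normal(size=(n_classes, n_sensors))
X = np.vstack([c + 0.2 * rng.normal(size=(n_per_class, n_sensors)) for c in centers])
y = np.repeat(np.arange(n_classes), n_per_class)
onehot = np.eye(n_classes)[y]

# Three-layer network: 8 inputs -> 16 tanh hidden units -> 4 softmax outputs
W1 = 0.5 * rng.normal(size=(n_sensors, 16)); b1 = np.zeros(16)
W2 = 0.5 * rng.normal(size=(16, n_classes)); b2 = np.zeros(n_classes)

lr = 0.3
for _ in range(500):                          # plain batch gradient descent
    H = np.tanh(X @ W1 + b1)
    logits = H @ W2 + b2
    P = np.exp(logits - logits.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)
    G = (P - onehot) / len(X)                 # softmax cross-entropy gradient
    GH = (G @ W2.T) * (1.0 - H ** 2)          # backprop through the tanh layer
    W2 -= lr * H.T @ G;  b2 -= lr * G.sum(axis=0)
    W1 -= lr * X.T @ GH; b1 -= lr * GH.sum(axis=0)

pred = np.argmax(np.tanh(X @ W1 + b1) @ W2 + b2, axis=1)
accuracy = float((pred == y).mean())
```

In practice the class labels would be the four roasting times and the inputs the steady-state sensor responses, with held-out beans used for the validation step the abstract describes.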

Keywords: artificial neural network, cocoa bean, electronic nose, roasting

Procedia PDF Downloads 232
20138 Fault Location Identification in High Voltage Transmission Lines

Authors: Khaled M. El Naggar

Abstract:

This paper introduces a digital method for fault section identification in transmission lines. The method uses a digitized set of measured short-circuit currents to locate faults in electrical power systems. The digitized current is used to construct an overdetermined system of equations, which is then solved using the proposed digital optimization technique to find the fault distance. The proposed optimization methodology is an application of the simulated annealing optimization technique. The method is tested using a practical case study. The accurate results obtained show that the algorithm can be used as a powerful tool in the area of power system protection.
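As a rough illustration of the optimization step, the sketch below applies simulated annealing to a one-dimensional fault-distance estimate by minimizing the squared mismatch between a "measured" and a modeled current amplitude. The 1/d amplitude model, line length, and annealing schedule are illustrative assumptions, not the paper's formulation.

```python
import math
import random

random.seed(1)

LINE_KM = 100.0
D_TRUE = 37.0                         # 'unknown' fault distance to recover

def model_amp(d):
    return 1000.0 / d                 # toy model: amplitude falls off with distance

target = model_amp(D_TRUE)            # stands in for the digitized measurement

def energy(d):
    return (model_amp(d) - target) ** 2

# Simulated annealing over the fault distance with geometric cooling
d, T = 50.0, 10.0
best_d, best_e = d, energy(d)
for _ in range(3000):
    cand = min(max(d + random.gauss(0.0, 2.0), 1.0), LINE_KM)
    dE = energy(cand) - energy(d)
    if dE < 0 or random.random() < math.exp(-dE / T):
        d = cand                      # accept downhill moves, and uphill with prob e^(-dE/T)
        if energy(d) < best_e:
            best_d, best_e = d, energy(d)
    T *= 0.998
```

In the paper's setting the objective would instead be the residual of the overdetermined equation system built from the digitized currents; the annealing loop itself is unchanged.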

Keywords: optimization, estimation, faults, measurement, high voltage, simulated annealing

Procedia PDF Downloads 391
20137 A Versatile Data Processing Package for Ground-Based Synthetic Aperture Radar Deformation Monitoring

Authors: Zheng Wang, Zhenhong Li, Jon Mills

Abstract:

Ground-based synthetic aperture radar (GBSAR) represents a powerful remote sensing tool for deformation monitoring of various geohazards, e.g. landslides, mudflows, avalanches, infrastructure failures, and the subsidence of residential areas. Unlike spaceborne SAR with a fixed revisit period, GBSAR data can be acquired with an adjustable temporal resolution through either continuous or discontinuous operation. However, challenges arise in processing high temporal-resolution continuous GBSAR data, including the extreme cost in computational random-access memory (RAM), the delay in producing displacement maps, and the loss of temporal evolution. Moreover, repositioning errors between discontinuous campaigns impede the accurate measurement of surface displacements. Therefore, a versatile package with two complete chains is developed in this study to process both continuous and discontinuous GBSAR data and address the aforementioned issues. The first chain is based on a small-baseline subset concept and processes continuous GBSAR images unit by unit, where the images within a window form a basic unit. With this strategy, the RAM requirement is reduced to a single unit of images and the chain can theoretically process an infinite number of images. The evolution of surface displacements can be detected because the chain keeps temporarily coherent pixels that are present only in certain units rather than throughout the whole observation period. The chain supports real-time processing of continuous data, so the delay in creating displacement maps can be shortened without waiting for the entire dataset. The other chain aims to measure deformation between discontinuous campaigns. Temporal averaging is carried out on a stack of images in a single campaign in order to improve the signal-to-noise ratio of discontinuous data and minimise the loss of coherence.
The temporal-averaged images are then processed by a particular interferometry procedure integrated with advanced interferometric SAR algorithms such as robust coherence estimation, non-local filtering, and selection of partially-coherent pixels. Experiments are conducted using both synthetic and real-world GBSAR data. Displacement time series at the level of a few sub-millimetres are achieved in several applications (e.g. a coastal cliff, a sand dune, a bridge, and a residential area), indicating the feasibility of the developed GBSAR data processing package for deformation monitoring of a wide range of scientific and practical applications.
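The unit-by-unit idea behind the first chain can be illustrated with a toy stream processor: only one window of acquisitions is ever held in memory, and per-acquisition phase differences are chained into a cumulative displacement series. The wavelength, window size, and phase model below are illustrative assumptions rather than the package's actual parameters.

```python
import math

WAVELENGTH = 0.0175  # metres; an assumed Ku-band GBSAR wavelength

def phase_stream(n, step_m=1e-4):
    """Synthetic stand-in for a continuous GBSAR phase stream: each new
    acquisition moves the target 0.1 mm along the line of sight."""
    for k in range(n):
        yield (4 * math.pi / WAVELENGTH) * (k * step_m)

def displacement_series(phases, unit_size=10):
    """Process the stream unit by unit: only one window of phases is held
    in memory at a time, mimicking a bounded-RAM processing chain."""
    history, window, prev = [], [], None
    for phi in phases:
        window.append(phi)
        if len(window) == unit_size:        # a full unit: process, then discard
            for p in window:
                if prev is not None:
                    # interferometric phase difference -> displacement increment
                    incr = (p - prev) * WAVELENGTH / (4 * math.pi)
                    history.append((history[-1] if history else 0.0) + incr)
                prev = p
            window.clear()
    return history

series = displacement_series(phase_stream(40), unit_size=10)
```

The real chain additionally performs coherence estimation, pixel selection, and phase unwrapping per unit; the point here is only the bounded-memory, incremental structure.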

Keywords: ground-based synthetic aperture radar, interferometry, small baseline subset algorithm, deformation monitoring

Procedia PDF Downloads 159
20136 Robust Adaptation to Background Noise in Multichannel C-OTDR Monitoring Systems

Authors: Andrey V. Timofeev, Viktor M. Denisov

Abstract:

A robust sequential nonparametric method is proposed for real-time adaptation to background noise parameters. The distribution of the background noise was modelled as a Huber contamination mixture. The method is designed to operate as an adaptation unit within the detection subsystem of an integrated multichannel monitoring system. The proposed method guarantees a given size of the nonasymptotic confidence set for the noise parameters. Properties of the suggested method are rigorously proved. The proposed algorithm has been successfully tested in the real conditions of a functioning C-OTDR monitoring system designed to monitor railways.

Keywords: guaranteed estimation, multichannel monitoring systems, non-asymptotic confidence set, contamination mixture

Procedia PDF Downloads 428
20135 Global Mittag-Leffler Stability of Fractional-Order Bidirectional Associative Memory Neural Network with Discrete and Distributed Transmission Delays

Authors: Swati Tyagi, Syed Abbas

Abstract:

Fractional-order Hopfield neural networks are generally used to model the information processing among interacting neurons. To show the constancy of the processed information, it is required to analyze the stability of these systems. In this work, we establish Mittag-Leffler stability for Caputo fractional-order bidirectional associative memory (BAM) neural networks with various time delays. We derive sufficient conditions to ensure the existence and uniqueness of the equilibrium point by using the theory of topological degree. By applying the fractional Lyapunov method and Mittag-Leffler functions, we derive sufficient conditions for global Mittag-Leffler stability, which further implies the global asymptotic stability of the network equilibrium. Finally, we present two suitable examples to show the effectiveness of the obtained results.
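For orientation, the stability notion used here is commonly stated as follows (a generic textbook form; the symbols are not necessarily the paper's notation). The one-parameter Mittag-Leffler function and the Mittag-Leffler stability bound are

```latex
E_{\alpha}(z) = \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(\alpha k + 1)}, \qquad \alpha > 0,
\qquad\text{and}\qquad
\|x(t)\| \le \left[ m\big(x(t_0)\big)\, E_{\alpha}\!\left(-\lambda \,(t-t_0)^{\alpha}\right) \right]^{b},
```

with constants λ > 0 and b > 0, and m(x) ≥ 0 locally Lipschitz with m(0) = 0. Since E_α(−λ t^α) → 0 as t → ∞, Mittag-Leffler stability implies asymptotic stability, which is the implication the abstract notes.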

Keywords: bidirectional associative memory neural network, existence and uniqueness, fractional-order, Lyapunov function, Mittag-Leffler stability

Procedia PDF Downloads 362
20134 On the Bootstrap P-Value Method in Identifying out of Control Signals in Multivariate Control Chart

Authors: O. Ikpotokin

Abstract:

In any production process, every product is expected to attain a certain standard, but the presence of an assignable cause of variability affects the process, thereby leading to low product quality. The ability to identify and remove this type of variability reduces its overall effect, thereby improving product quality. In the case of a univariate control chart signal, it is easy to detect the problem and provide a solution, since it relates to a single quality characteristic. However, the problems involved in the use of a multivariate control chart are the violation of the multivariate normality assumption and the difficulty of identifying the quality characteristic(s) responsible for the out-of-control signals. The purpose of this paper is to examine the use of a non-parametric control chart (the bootstrap approach) for obtaining control limits, to overcome the problem of the multivariate distributional assumption, together with the p-value method for detecting out-of-control signals. Results from a performance study show that the proposed bootstrap method enables the setting of control limits that enhance the detection of out-of-control signals, while the p-value method improves the identification of the out-of-control variables.
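A schematic version of the procedure: resample a phase-I reference sample to build a null distribution of a multivariate distance statistic, take a high quantile as a distribution-free control limit, and report a resampling p-value for new observations. The data, the Hotelling-type statistic, and the quantile level are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

rng = np.random.default_rng(42)

# Phase-I reference data: 200 observations of 3 quality characteristics
X = rng.normal(50.0, 2.0, size=(200, 3))
mean = X.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(X, rowvar=False))

def t2(x):
    d = x - mean
    return float(d @ cov_inv @ d)       # Hotelling-type squared distance

# Bootstrap the null distribution of the statistic from the reference data
B = 2000
boot = np.array([t2(X[rng.integers(0, len(X))]) for _ in range(B)])
control_limit = float(np.quantile(boot, 0.995))   # distribution-free limit

def p_value(x_new):
    """Bootstrap p-value: share of reference statistics at least as extreme."""
    return float((boot >= t2(x_new)).mean())

in_control = p_value(np.array([50.5, 49.0, 51.0]))   # near the process centre
out_control = p_value(np.array([60.0, 60.0, 60.0]))  # clearly shifted point
```

Applying the same p-value idea characteristic-by-characteristic is one way to flag which variable drove an out-of-control signal, which is the identification problem the abstract targets.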

Keywords: bootstrap control limit, p-value method, out-of-control signals, p-value, quality characteristics

Procedia PDF Downloads 345
20133 Numerical Method for Heat Transfer Problem in a Block Having an Interface

Authors: Beghdadi Lotfi, Bouziane Abdelhafid

Abstract:

A finite volume method for quadrilateral unstructured meshes is developed to predict the two-dimensional steady-state solutions of the conduction equation. In this scheme, based on integration around the polygonal control volume, the derivatives in the conduction equation are converted into closed line integrals using a formulation of Stokes' theorem. To validate the accuracy of the method, two numerical experiments are used: conduction in a regular block (with a known analytical solution) and conduction in a rotated block (a case with curved boundaries). The numerical results show good agreement with the analytical results. To demonstrate the accuracy of the method, the absolute and root-mean-square errors versus the grid size are examined quantitatively.
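The conversion referred to above is, in two dimensions, the divergence (Green/Stokes) theorem applied to each polygonal control volume. Written generically for steady conduction without sources (standard symbols, not necessarily the authors' notation):

```latex
\int_{\Omega_i} \nabla \cdot \left( k \,\nabla T \right) \, d\Omega
\;=\; \oint_{\partial \Omega_i} k \,\nabla T \cdot \mathbf{n} \, ds
\;\approx\; \sum_{e \,\in\, \partial \Omega_i} k_e \left( \nabla T \right)_e \cdot \mathbf{n}_e \, \Delta s_e \;=\; 0,
```

where the sum runs over the edges e of the quadrilateral control volume Ω_i, with outward unit normal n_e and edge length Δs_e. Writing this balance for every control volume yields the discrete system solved on the unstructured mesh.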

Keywords: Stokes theorem, unstructured grid, heat transfer, complex geometry

Procedia PDF Downloads 288
20132 Method of Visual Prosthesis Design Based on Biologically Inspired Design

Authors: Shen Jian, Hu Jie, Zhu Guo Niu, Peng Ying Hong

Abstract:

Two issues exist in traditional visual prosthesis design: the lack of a systematic method and a low level of humanization. To tackle these obstacles, a visual prosthesis design method based on biologically inspired design is proposed. Firstly, a constrained FBS knowledge cell model is applied to construct the functional model of the visual prosthesis in the biological field. Then the clustering results for the engineering domain are obtained with the use of a cross-domain knowledge cell clustering algorithm. Finally, a prototype system is designed to support the biologically inspired design, in which conflicts are resolved by TRIZ and other tools, and the validity of the method is verified by the solution scheme.

Keywords: knowledge-based engineering, visual prosthesis, biologically inspired design, biomedical engineering

Procedia PDF Downloads 189
20131 Method Validation for Determining Platinum and Palladium in Catalysts Using Inductively Coupled Plasma Optical Emission Spectrometry

Authors: Marin Senila, Oana Cadar, Thorsten Janisch, Patrick Lacroix-Desmazes

Abstract:

The study presents the analytical capability and validation of a method based on microwave-assisted acid digestion for the quantitative determination of platinum and palladium in catalysts using inductively coupled plasma optical emission spectrometry (ICP-OES). In order to validate the method, the main figures of merit, such as limit of detection, limit of quantification, precision and accuracy, were considered, and the measurement uncertainty was estimated using the bottom-up approach according to the international guidelines of ISO/IEC 17025. Limits of detection, estimated from the blank signal using the 3s criterion, were 3.0 mg/kg for Pt and 3.6 mg/kg for Pd, while limits of quantification were 9.0 mg/kg for Pt and 10.8 mg/kg for Pd. Precision, evaluated as the standard deviation of repeatability (n=5 parallel samples), was less than 10% for both precious metals. Accuracy, verified by recovery estimation using the certified reference material NIST SRM 2557 (pulverized recycled monolith), was 99.4% for Pt and 101% for Pd. The obtained limits of quantification and accuracy were satisfactory for the intended purpose. The paper describes all the steps necessary to validate the method for the determination of Pt and Pd in catalysts using ICP-OES.
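The 3s criterion mentioned above can be illustrated in a few lines. The blank readings and calibration slope below are hypothetical placeholders, not the study's data; note also that the sketch uses the common 10s convention for LOQ (LOQ ≈ 3.3×LOD), whereas the abstract's reported LOQs equal 3×LOD.

```python
import statistics

# Hypothetical replicate blank-signal readings (intensity counts) for one
# emission line; real values would come from ICP-OES blank measurements.
blank = [101.2, 99.8, 100.5, 100.9, 99.4, 100.1, 100.7, 99.9, 100.3, 100.6]
slope = 250.0   # assumed calibration sensitivity, counts per (mg/kg)

s_blank = statistics.stdev(blank)   # standard deviation of the blank signal
lod = 3 * s_blank / slope           # 3s criterion for the limit of detection
loq = 10 * s_blank / slope          # common 10s criterion for quantification
```

Both limits scale inversely with the calibration slope, which is why a more sensitive emission line lowers the detection limit for the same blank noise.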

Keywords: catalyst analysis, ICP-OES, method validation, platinum, palladium

Procedia PDF Downloads 165
20130 Preservation of Coconut Toddy Sediments as a Leavening Agent for Bakery Products

Authors: B. R. Madushan, S. B. Navaratne, I. Wickramasinge

Abstract:

Toddy sediment (TS) was cultured in a PDA medium to determine the initial yeast load, and was subjected to sun, shade, solar, dehumidified cold air (DCA) and hot air oven (at 40, 50 and 60 °C) drying with a view to preserving the viability of the yeast. Thereafter, the study was conducted according to a two-factor factorial design in order to determine the best preservation method. The dried TS from the best drying method was divided into two portions. One portion was mixed with rice flour at a TS:rice flour ratio of 3:7, and the mixture was divided into two again: one part was kept under in-house conditions while the other was refrigerated. The same procedure was followed for the remaining portion of TS, but with corn flour at the same ratio. All treatments were vacuum-packed in triple laminate pouches, and the best preservation method was determined in terms of leavening index (LI). The TS obtained from the best preservation method was used to make foods (bread and hoppers), and their organoleptic properties were evaluated against those of ordinary foods by a sensory panel using a five-point hedonic scale. Results revealed that the yeast load of fresh TS was 58×10⁶ CFU/g. The best drying method for preserving the viability of the yeast was DCA, because the LI of this treatment (96%) was higher than that of the other treatments. The organoleptic properties of foods prepared by the best preservation method were the same as those of ordinary foods according to the duo-trio test.

Keywords: biological leavening agent, coconut toddy, fermentation, yeast

Procedia PDF Downloads 340
20129 Exploration into Bio Inspired Computing Based on Spintronic Energy Efficiency Principles and Neuromorphic Speed Pathways

Authors: Anirudh Lahiri

Abstract:

Neuromorphic computing, inspired by the intricate operations of biological neural networks, offers a revolutionary approach to overcoming the limitations of traditional computing architectures. This research proposes the integration of spintronics with neuromorphic systems, aiming to enhance computational performance, scalability, and energy efficiency. Traditional computing systems, based on the Von Neumann architecture, struggle with scalability and efficiency due to the segregation of memory and processing functions. In contrast, the human brain exemplifies high efficiency and adaptability, processing vast amounts of information with minimal energy consumption. This project explores the use of spintronics, which utilizes the electron's spin rather than its charge, to create more energy-efficient computing systems. Spintronic devices, such as magnetic tunnel junctions (MTJs) manipulated through spin-transfer torque (STT) and spin-orbit torque (SOT), offer a promising pathway to reducing power consumption and enhancing the speed of data processing. The integration of these devices within a neuromorphic framework aims to replicate the efficiency and adaptability of biological systems. The research is structured into three phases: an exhaustive literature review to build a theoretical foundation, laboratory experiments to test and optimize the theoretical models, and iterative refinements based on experimental results to finalize the system. The initial phase focuses on understanding the current state of neuromorphic and spintronic technologies. The second phase involves practical experimentation with spintronic devices and the development of neuromorphic systems that mimic synaptic plasticity and other biological processes. The final phase focuses on refining the systems based on feedback from the testing phase and preparing the findings for publication. The expected contributions of this research are twofold. 
Firstly, it aims to significantly reduce the energy consumption of computational systems while maintaining or increasing processing speed, addressing a critical need in the field of computing. Secondly, it seeks to enhance the learning capabilities of neuromorphic systems, allowing them to adapt more dynamically to changing environmental inputs, thus better mimicking the human brain's functionality. The integration of spintronics with neuromorphic computing could revolutionize how computational systems are designed, making them more efficient, faster, and more adaptable. This research aligns with the ongoing pursuit of energy-efficient and scalable computing solutions, marking a significant step forward in the field of computational technology.

Keywords: material science, biological engineering, mechanical engineering, neuromorphic computing, spintronics, energy efficiency, computational scalability, synaptic plasticity

Procedia PDF Downloads 41
20128 The Application of the Analytic Basis Function Expansion Triangular-z Nodal Method for Neutron Diffusion Calculation

Authors: Kunpeng Wang, Hongchun Wu, Liangzhi Cao, Chuanqi Zhao

Abstract:

The distributions of the homogeneous neutron flux within a node were expanded into a set of analytic basis functions which satisfy the diffusion equation at any point in a triangular-z node for each energy group, and nodes were coupled with each other through both the zero- and first-order partial neutron current moments across all the interfaces of the triangular prism at the same time. Based on this method, a code, TABFEN, has been developed and applied to solve the neutron diffusion equation in complicated geometries. In addition, after a series of numerical derivations, the neutron adjoint diffusion equations can be written in a matrix form identical to that of the neutron diffusion equation; therefore, they can also be solved by TABFEN, and a low-high scan strategy is adopted to improve the efficiency. Four benchmark problems are tested by this method to verify its feasibility; the results show good agreement with the references, which demonstrates the efficiency and feasibility of the method.

Keywords: analytic basis function expansion method, arbitrary triangular-z node, adjoint neutron flux, complicated geometry

Procedia PDF Downloads 444
20127 Shoring System Selection for Deep Excavation

Authors: Faouzi Ahtchi-Ali, Marcus Vitiello

Abstract:

A study was conducted in the eastern region of the Middle East to assess the constructability of a shoring system for a 12-meter deep excavation. Several shoring systems were considered in this study, including secant concrete piling, contiguous concrete piling, and sheet piling. The excavation was carried out in a very dense sand with the groundwater level located 3 meters below the ground surface. The study included conducting a pilot test for each shoring system listed above. The secant concrete piling comprised overlapping concrete piles to a depth of 16 meters; a drilling method with full steel casing was utilized to install the concrete piles, and the verticality of the piles was a concern for the overlap. The contiguous concrete piling required the installation of micro-piles to seal the gaps between the concrete piles; this method revealed that the gaps between the piles were not fully sealed, as evidenced by groundwater penetrating into the excavation. The sheet-piling method required pre-drilling due to the high blow count of the penetrated layer of saturated sand. The study concluded that the sheet-piling method with pre-drilling was the most cost-effective, and recommended it as the method for the shoring system.

Keywords: excavation, shoring system, Middle East, drilling method

Procedia PDF Downloads 467
20126 Structural and Optical Properties of Silver Sulfide/Reduced Graphene Oxide Nanocomposite

Authors: Oyugi Ngure Robert, Kallen Mulilo Nalyanya, Tabitha A. Amollo

Abstract:

Nanomaterials have attracted significant attention in research because of their exemplary properties, making them suitable for diverse applications. This paper reports the successful synthesis as well as the structural properties of silver sulfide/reduced graphene oxide (Ag2S-rGO) nanocomposite. The nanocomposite was synthesized by the chemical reduction method. Scanning electron microscopy (SEM) showed that the reduced graphene oxide (rGO) sheets were intercalated within the Ag2S nanoparticles during the chemical reduction process. The SEM images also showed that Ag2S had the shape of nanowires. Further, SEM energy-dispersive X-ray spectroscopy (SEM-EDX) showed that Ag2S-rGO is mainly composed of C, Ag, O, and S. X-ray diffraction analysis manifested a high crystallinity for the nanowire-shaped Ag2S nanoparticles with a d-spacing ranging between 1.0 Å and 5.2 Å. Thermogravimetric analysis (TGA) showed that rGO enhances the thermal stability of the nanocomposite. The Ag2S-rGO nanocomposite exhibited strong optical absorption in the UV region. The formed nanocomposite is dispersible in polar and non-polar solvents, qualifying it for solution-based device processing.

Keywords: silver sulfide, reduced graphene oxide, nanocomposite, structural properties, optical properties

Procedia PDF Downloads 94
20125 Low Power Glitch Free Dual Output Coarse Digitally Controlled Delay Lines

Authors: K. Shaji Mon, P. R. John Sreenidhi

Abstract:

In deep-submicrometer CMOS processes, the time-domain resolution of a digital signal is becoming higher than the voltage resolution of analog signals. This claim is nowadays pushing toward a new circuit design paradigm in which traditional analog signal processing is expected to be progressively substituted by the processing of times in the digital domain. Within this novel paradigm, digitally controlled delay lines (DCDL) should play the role of digital-to-analog converters in traditional, analog-intensive circuits. Digital delay-locked loops are highly prevalent in integrated systems. The proposed paper addresses the glitches present in delay circuits, along with area, power dissipation and signal integrity. The digitally controlled delay lines under study have been designed in a 90 nm CMOS technology with six metal layers of copper, strained SiGe, and a low-k dielectric. Simulation and synthesis results show that the novel circuits exhibit no glitches for the dual-output coarse DCDL, with less power dissipation and less area compared to the glitch-free NAND-based DCDL.

Keywords: glitch free, NAND-based DCDL, CMOS, deep-submicrometer

Procedia PDF Downloads 244
20124 WiFi Data Offloading: Bundling Method in a Canvas Business Model

Authors: Majid Mokhtarnia, Alireza Amini

Abstract:

Mobile operators deal with increasing data traffic as a critical issue. As a result, a vital responsibility of the operators is to cope with this trend in order to create added value. This paper addresses a bundling method in a Canvas business model within a WiFi Data Offloading (WDO) strategy, by which some elements of the model may be affected. In the proposed method, a number of data packages are sold to subscribers, some of which include a complimentary volume of WiFi-offloaded data. The paper analyses this method from the viewpoints of attractiveness and profitability. The results demonstrate that the quality of implementation of the WDO strongly affects the final result and helps the decision maker to make the best decision.

Keywords: bundling, canvas business model, telecommunication, WiFi data offloading

Procedia PDF Downloads 198
20123 Research of Applicable Ground Reinforcement Method in Double-Deck Tunnel Junction

Authors: SKhan Park, Seok Jin Lee, Jong Sun Kim, Jun Ho Lee, Bong Chan Kim

Abstract:

Because of the large economic losses caused by traffic congestion in metropolitan areas, various studies on underground network design and construction techniques have been performed in developed countries. In Korea, a study has been performed to develop a versatile double-deck deep tunnel model. This paper is an introduction to the development of a ground reinforcement method to enable safe tunnel construction in weakened pillar sections, such as tunnel junctions. An applicable ground reinforcement method for the weakened section is proposed, and it is expected that the method will be verified by field application tests.

Keywords: double-deck tunnel, ground reinforcement, tunnel construction, weakened pillar section

Procedia PDF Downloads 406
20122 Prediction of the Thermal Parameters of a High-Temperature Metallurgical Reactor Using Inverse Heat Transfer

Authors: Mohamed Hafid, Marcel Lacroix

Abstract:

This study presents an inverse analysis for predicting the thermal conductivities and the heat flux of a high-temperature metallurgical reactor simultaneously. Once these thermal parameters are predicted, the time-varying thickness of the protective phase-change bank that covers the inside surface of the brick walls of a metallurgical reactor can be calculated. The enthalpy method is used to solve the melting/solidification process of the protective bank. The inverse model rests on the Levenberg-Marquardt Method (LMM) combined with the Broyden method (BM). A statistical analysis for the thermal parameter estimation is carried out. The effect of the position of the temperature sensors, total number of measurements and measurement noise on the accuracy of inverse predictions is investigated. Recommendations are made concerning the location of temperature sensors.
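A minimal sketch of Levenberg-Marquardt parameter recovery on a toy cooling curve: one unknown rate parameter stands in for a thermal property, and the damping factor switches the step between gradient-descent-like and Gauss-Newton behaviour. The model, values, and noise level are illustrative; the paper's actual inverse problem couples conduction with phase change and also uses the Broyden method.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic 'sensor' data: T(t) = T_inf + (T0 - T_inf) exp(-k t), with k unknown
T_inf, T0, k_true = 20.0, 100.0, 0.05
t = np.arange(0.0, 105.0, 5.0)
data = T_inf + (T0 - T_inf) * np.exp(-k_true * t) + rng.normal(0.0, 0.5, t.size)

def model(k):
    return T_inf + (T0 - T_inf) * np.exp(-k * t)

k, mu = 0.01, 1.0                               # initial guess and LM damping
sse = float(np.sum((data - model(k)) ** 2))
for _ in range(50):
    r = data - model(k)                         # residuals
    J = -(T0 - T_inf) * t * np.exp(-k * t)      # sensitivity dT/dk
    dk = float(J @ r) / (float(J @ J) + mu)     # damped (LM) normal-equation step
    new_sse = float(np.sum((data - model(k + dk)) ** 2))
    if new_sse < sse:
        k, sse, mu = k + dk, new_sse, mu / 2.0  # accept: relax toward Gauss-Newton
    else:
        mu *= 10.0                              # reject: increase damping
```

The same accept/reject damping logic carries over to the multi-parameter case (conductivities plus heat flux), with J becoming a sensitivity matrix computed from the direct model.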

Keywords: inverse heat transfer, phase change, metallurgical reactor, Levenberg–Marquardt method, Broyden method, bank thickness

Procedia PDF Downloads 332
20121 Effect of Blanching and Drying Methods on the Degradation Kinetics and Color Stability of Radish (Raphanus sativus) Leaves

Authors: K. Radha Krishnan, Mirajul Alom

Abstract:

Dehydrated powder prepared from fresh radish (Raphanus sativus) leaves was investigated for color stability under different drying methods (tray, sun and solar). The effects of blanching conditions, drying methods and drying temperatures (50–90 °C) were considered in studying the color degradation kinetics of chlorophyll in the dehydrated powder. The Hunter color parameters (L*, a*, b*) and total color difference (TCD) were determined in order to investigate the color degradation kinetics of chlorophyll. Blanching conditions, drying method and drying temperature influenced the changes in L*, a*, b* and TCD values. The changes in color values during processing were described by a first-order kinetic model, and the temperature dependence of chlorophyll degradation was adequately modeled by the Arrhenius equation. To predict the losses in green color, a mathematical model was developed from the steady-state kinetic parameters. The results from this study indicate the protective effect of blanching conditions on the color stability of dehydrated radish powder.
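The first-order plus Arrhenius modelling can be sketched with two linear fits: a log-linear fit of the color value against time at each temperature gives the rate constants, and a fit of ln k against 1/T recovers the activation energy. All numbers below are illustrative, generated from an assumed activation energy rather than taken from the study.

```python
import numpy as np

R = 8.314                            # J/(mol K)
A_true, Ea_true = 2.0e5, 45_000.0    # assumed pre-exponential factor, activation energy

temps_K = np.array([323.15, 333.15, 343.15, 353.15, 363.15])   # 50-90 degC
times = np.linspace(0.0, 120.0, 7)                             # minutes

k_fit = []
for T in temps_K:
    k = A_true * np.exp(-Ea_true / (R * T))      # first-order rate constant at T
    colour = np.exp(-k * times)                  # C/C0 under first-order decay
    slope = np.polyfit(times, np.log(colour), 1)[0]
    k_fit.append(-slope)                         # recovered rate constant

# Arrhenius plot: ln k versus 1/T is a straight line with slope -Ea/R
slope, intercept = np.polyfit(1.0 / temps_K, np.log(k_fit), 1)
Ea_est, A_est = -slope * R, float(np.exp(intercept))
```

With noisy measured color values, the same two fits yield the kinetic parameters from which the predictive green-color-loss model is built.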

Keywords: chlorophyll, color stability, degradation kinetics, drying

Procedia PDF Downloads 398
20120 Practice of Mutual Squiggle Story Making as a Variant of Squiggle Method

Authors: Toshiki Ito

Abstract:

Mutual squiggle story making (MSSM) is a development of Winnicott's squiggle method that originated in Japan. In the MSSM method, the therapist has the client freely divide a piece of drawing paper into six spaces, and both the therapist and the client squiggle in each space. Once all six pictures are finished, the therapist asks the client to create a story using all of them. Making a story has the effect of reintegrating what is projected by consciousness. In this paper, the author presents a case with a junior high school girl using MSSM. The advantages of this technique are considered to be that (1) it enables non-verbal communication with children and adults who cannot express their feelings verbally; (2) through this communication, the psychological content and the characteristics of the client's mind can be understood; and (3) mutual rapport is deepened by the supportive reaction of the therapist.

Keywords: MSSM, squiggle, Winnicott, drawing method

Procedia PDF Downloads 198
20119 A Study of Adaptive Fault Detection Method for GNSS Applications

Authors: Je Young Lee, Hee Sung Kim, Kwang Ho Choi, Joonhoo Lim, Sebum Chun, Hyung Keun Lee

Abstract:

The purpose of this study is to develop an efficient fault detection method for Global Navigation Satellite System (GNSS) applications based on adaptive estimation. Because of their dependence on radio-frequency signals, GNSS measurements are dominated by systematic errors in the receiver's operating environment. Thus, to utilize GNSS for aerospace or ground vehicles requiring a high level of safety, unhealthy measurements should be treated seriously. For this reason, this paper proposes an adaptive fault detection method to deal with unhealthy measurements in various harsh environments. In the proposed method, the test statistic for fault detection is generated from the estimated measurement noise. Pseudorange and carrier-phase measurement noise are obtained at the time propagations and measurement updates of Carrier-Smoothed Code (CSC) filtering, respectively. The performance of the proposed method was evaluated using field-collected GNSS measurements. To evaluate the fault detection capability, intentional faults were added to the measurements. The experimental results show that the proposed detection method is efficient in detecting unhealthy measurements and improves the accuracy of GNSS positioning under fault occurrence.
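A rough sketch of the idea: a Hatch (carrier-smoothed code) filter propagates the smoothed range with carrier deltas at time propagation, and the measurement-update residual, scaled by an adaptively estimated noise level, serves as the fault-test statistic. The signal models, noise levels, and threshold below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(7)

n = 120
true_range = 20_000_000.0 + 800.0 * np.arange(n)   # smoothly changing range (m)
code = true_range + rng.normal(0.0, 1.0, n)        # noisy code pseudorange
carrier = true_range + rng.normal(0.0, 0.01, n)    # precise carrier (no cycle slips)
code[60] += 30.0                                   # injected measurement fault

N = 30                                             # Hatch filter window length
smoothed = code[0]
residuals, flagged = [], []
for k in range(1, n):
    # time propagation: carry the smoothed range forward with the carrier delta
    predicted = smoothed + (carrier[k] - carrier[k - 1])
    innov = code[k] - predicted                    # measurement-update residual
    if len(residuals) >= 10:
        sigma = np.std(residuals[-N:])             # adaptive noise estimate
        if abs(innov) > 5.0 * sigma:               # test statistic vs. threshold
            flagged.append(k)
            smoothed = predicted                   # reject the faulty measurement
            continue
    residuals.append(innov)
    smoothed = predicted + innov / N               # Hatch measurement update
```

The 30 m fault produces a residual far beyond five estimated standard deviations and is excluded from the smoothed solution, which is the mechanism by which detection protects positioning accuracy.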

Keywords: adaptive estimation, fault detection, GNSS, residual

Procedia PDF Downloads 571
20118 Vocational Teaching Method: A Conceptual Model in Teaching Automotive Practical Work

Authors: Adnan Ahmad, Yusri Kamin, Asnol Dahar Minghat, Mohd. Khir Nordin, Dayana Farzeha, Ahmad Nabil

Abstract:

The purpose of this study is to identify the teaching methods used for practical work subjects in Vocational Secondary Schools (VSS), examining in particular the practice of the Vocational Teaching Method in Automotive Practical Work (VTM-APW). A quantitative approach using questionnaires was adopted, with 283 students and 63 teachers from ten VSS taking part. The findings show that in the introduction session teachers prefer the demonstration method and questioning techniques, while in delivering the content of a practical task they apply group monitoring and a problem-solving approach. To conclude an automotive practical task, teachers choose re-explanation and report writing to make sure students really understand the whole process. The VTM-APW model also embeds the competency-based concept. From the factors investigated, the research produced a combination of teaching-skill and vocational-skill elements that can serve as a suitable teaching method for automotive practical work at school level. In conclusion, the VTM-APW model can be applied in teaching to improve current practice in Vocational Secondary Schools. Teachers are therefore encouraged to use this method to enhance students' knowledge of automotive work and to equip the current and future workforce with the competencies required in the workplace.

Keywords: vocational teaching method, practical task, teacher preferences, student preferences

Procedia PDF Downloads 451
20117 Finite Element Method as a Solution Procedure for Problems in Tissue Biomechanics

Authors: Momoh Omeiza Sheidu

Abstract:

The finite element method, as a means of solving problems in computational biomechanics, provides a framework for modeling the function of tissues that integrates structurally from the cell to the organ system and functionally across the physiological processes that affect tissue mechanics or are regulated by mechanical forces. In this paper, we present an integrative finite element strategy for solving problems in tissue biomechanics as a case study.
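The basic finite element workflow the abstract refers to (discretize, assemble, apply boundary conditions, solve) can be illustrated with a deliberately simple one-dimensional example: an elastic strip, such as a tissue sample, fixed at one end and loaded at the other. All material values and the solver are illustrative assumptions, not taken from the paper.

```python
def bar_fem(n_elems, length, EA, tip_force):
    """Minimal 1-D finite element sketch: assemble the global stiffness
    matrix from 2-node linear elements for an elastic strip fixed at
    x = 0, apply a point load at the free end, and solve K u = f by
    naive Gaussian elimination (illustrative only)."""
    n = n_elems + 1                      # number of nodes
    h = length / n_elems                 # element length
    k = EA / h                           # element stiffness
    K = [[0.0] * n for _ in range(n)]
    for e in range(n_elems):             # assemble element matrices
        K[e][e] += k;   K[e][e+1] -= k
        K[e+1][e] -= k; K[e+1][e+1] += k
    f = [0.0] * n
    f[-1] = tip_force                    # point load at the free end
    # enforce the fixed boundary u[0] = 0 by removing row/column 0
    A = [row[1:] for row in K[1:]]
    b = f[1:]
    m = len(b)
    for i in range(m):                   # forward elimination
        for j in range(i + 1, m):
            r = A[j][i] / A[i][i]
            for c in range(i, m):
                A[j][c] -= r * A[i][c]
            b[j] -= r * b[i]
    u = [0.0] * m
    for i in range(m - 1, -1, -1):       # back substitution
        u[i] = (b[i] - sum(A[i][c] * u[c] for c in range(i + 1, m))) / A[i][i]
    return [0.0] + u                     # prepend the fixed node
```

For this linear problem the FEM solution reproduces the exact tip displacement F·L/EA, which is a standard sanity check before moving to the nonlinear, multi-scale tissue models the abstract has in view.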

Keywords: finite element, biomechanics, modeling, computational biomechanics

Procedia PDF Downloads 501
20116 The Harada Method: A Method for Employee Development during Production Ramp Up

Authors: M. Goerke, J. Gehrmann

Abstract:

Driven by shorter product life cycles and higher product variety, the importance of production ramp-ups is increasing. Even though companies are aware of this, up to 40% of ramp-up projects still miss their technical and economic requirements. The success of a ramp-up depends on the planning of human factors, organizational aspects, and technological solutions. Since the human factor during production ramp-up is only partly considered in the scientific literature, this paper focuses on it. The existing methods that address problems in this area are incoherent; a systematic and holistic method for improving employees' capabilities during ramp-up is missing. The Harada Method is a relatively young approach for developing highly skilled workers. It consists of different worksheets that help employees set guidelines and reach overall objectives. In this work, the approach is transferred into a tool for ramp-up management.

Keywords: employee development, Harada, production ramp up, organizational aspects

Procedia PDF Downloads 456
20115 Mapping of Urban Green Spaces Towards a Balanced Planning in a Coastal Landscape

Authors: Rania Ajmi, Faiza Allouche Khebour, Aude Nuscia Taibi, Sirine Essasi

Abstract:

Urban green spaces (UGS) are an important contributor to sustainable development. A spatial method was employed to assess and map the distribution of UGS in five districts of Sousse, Tunisia. Ecological management of UGS is an essential factor in the sustainable development of the city, so the municipality of Sousse has decided to support the districts according to the character of their green spaces. To implement this policy, (1) a new GIS web application was developed, (2) the various green spaces were entered into the application, (3) the UGS were mapped spatially using Quantum GIS, and (4) the data were processed and statistically analyzed in the R programming language using RStudio. Intersecting the results of the spatial and statistical analyses highlighted an imbalance in the spatial distribution of UGS in the study area. The discontinuity between the coast and the city's green spaces was not designed in a spirit of networks and connections, hence the lack of a greenway connecting these spaces to the city. Finally, this GIS support will be used by decision-makers to assess and monitor green spaces in the city of Sousse and will contribute to improving the well-being of the local population.
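The kind of imbalance analysis described, comparing per-capita green space across districts against a planning target, can be sketched in a few lines. The district figures and the 9 m² per person threshold (a value often attributed to WHO guidance) are purely illustrative assumptions, not the study's data.

```python
def green_space_balance(data, target_m2=9.0):
    """Compute green space per capita (m2/person) for each district and
    flag districts that fall below an illustrative planning target.
    `data` maps district name -> (green area in hectares, population)."""
    report = {}
    for name, (area_ha, pop) in data.items():
        m2_per_capita = area_ha * 10_000 / pop   # 1 ha = 10,000 m2
        report[name] = (round(m2_per_capita, 1), m2_per_capita < target_m2)
    return report

# hypothetical figures for three districts
districts = {
    "District A": (42.0, 30000),
    "District B": (8.5, 55000),
    "District C": (25.0, 20000),
}
```

Joining such a table back to district polygons in QGIS (or with a spatial library) would then reproduce the kind of imbalance map the abstract describes.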

Keywords: distributions, GIS, green space, imbalance, spatial analysis

Procedia PDF Downloads 201
20114 Conduction Accompanied With Transient Radiative Heat Transfer Using Finite Volume Method

Authors: A. Ashok, K. Satapathy, B. Prerana Nashine

Abstract:

The objective of this research work is to investigate the one-dimensional transient radiative transfer equation coupled with conduction using the finite volume method. Within the finite-volume framework, we obtain a conservative discretization of the terms in order to preserve the overall conservation property of finite-volume schemes. The coupling of the conductive and radiative equations, and the resulting fluxes, is governed by the magnitude of the emissivity, the extinction coefficient, and the temperature of the medium, as well as by the geometry of the problem. The problem under consideration, radiation in a slab coupled with transient conduction, has been solved with the finite volume method. The boundary conditions are chosen so as to give a good model of the discretized form of the radiative transfer equation. An important feature of the present method is the flexibility in specifying the control angles in the FVM while keeping the solution procedure simple. The effects of the various model parameters on the distributions of temperature, radiative and conductive heat fluxes, and incident radiation energy are examined. The finite volume method is found to evaluate effectively the propagation of radiation intensity through a participating medium.
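A stripped-down version of the coupled update can be written as one explicit finite-volume time step for 1-D conduction with a per-cell radiative source term. This is a toy stand-in, assuming a simple optically thin loss term rather than the full control-angle radiative transfer solve of the paper; all coefficients are illustrative.

```python
def fvm_step(T, dx, dt, alpha, kappa_r=0.0, T_env=0.0):
    """One explicit finite-volume update for 1-D transient conduction
    with a crude radiative exchange term per control volume. Boundary
    cells are held fixed (Dirichlet conditions). Stability requires
    alpha*dt/dx**2 <= 0.5 for the conduction part."""
    sigma = 5.670e-8                     # Stefan-Boltzmann constant
    new = T[:]
    for i in range(1, len(T) - 1):
        conduction = alpha * (T[i-1] - 2*T[i] + T[i+1]) / dx**2
        radiation = -kappa_r * sigma * (T[i]**4 - T_env**4)
        new[i] = T[i] + dt * (conduction + radiation)
    return new
```

With kappa_r set to zero this reduces to the standard explicit conduction scheme, so the symmetric heating of a slab from two hot boundaries provides a quick correctness check.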

Keywords: participating media, finite volume method, radiation coupled with conduction, transient radiative heat transfer

Procedia PDF Downloads 387
20113 Scientific Linux Cluster for BIG-DATA Analysis (SLBD): A Case of Fayoum University

Authors: Hassan S. Hussein, Rania A. Abul Seoud, Amr M. Refaat

Abstract:

Scientific researchers face the analysis of very large data sets, which are growing at a noticeable rate with today's and tomorrow's technologies. Hadoop and Spark are software frameworks developed for this purpose, and the Hadoop framework is suitable for many different hardware platforms. In this research, a Scientific Linux cluster for Big Data analysis (SLBD) is presented. SLBD runs open-source software on a high-performance cluster infrastructure with large computational capacity. It is composed of a single cluster of identical, commodity-grade computers interconnected via a small LAN: four nodes connected through Gigabit-Ethernet cards and a fast switch. Cloudera Manager is used to configure and manage an Apache Hadoop stack. Hadoop is a framework that allows big data to be stored and processed across the cluster using the MapReduce algorithm, which divides a job into smaller tasks assigned to the cluster nodes and then collects the partial results to form the final result set. The SLBD clustering system allows fast and efficient processing of the large amounts of data produced by different applications, and provides high performance, high throughput, high availability, expandability, and cluster scalability.
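The map, shuffle, and reduce phases described above can be illustrated with the classic word-count example, here as a single-process Python sketch of the data flow only (Hadoop of course distributes these phases across the cluster nodes):

```python
from collections import defaultdict

def map_phase(record):
    """Map: emit (key, 1) pairs for each word, as a mapper would."""
    return [(word.lower(), 1) for word in record.split()]

def reduce_phase(key, values):
    """Reduce: aggregate all values collected for one key."""
    return key, sum(values)

def mapreduce(records):
    """Tiny single-process illustration of the MapReduce flow:
    map -> shuffle (group values by key) -> reduce."""
    shuffled = defaultdict(list)
    for record in records:               # map + shuffle
        for key, value in map_phase(record):
            shuffled[key].append(value)
    return dict(reduce_phase(k, v) for k, v in shuffled.items())
```

On a real Hadoop cluster the records would come from HDFS splits, the shuffle would move intermediate pairs between nodes, and the reducers would run in parallel, but the logical structure is the same.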

Keywords: big data platforms, cloudera manager, Hadoop, MapReduce

Procedia PDF Downloads 357
20112 Forecasting Etching Behavior Silica Sand Using the Design of Experiments Method

Authors: Kefaifi Aissa, Sahraoui Tahar, Kheloufi Abdelkrim, Anas Sabiha, Hannane Farouk

Abstract:

The aim of this study is to show how the Design of Experiments (DOE) method can be used as a practical approach to modeling the etching behavior of silica sand during the primary leaching step. In the present work, we studied the etching effect on particle size during primary leaching of Algerian silica sand with hydrofluoric acid (HF) at 20% and 30% for 4 and 8 hours; a new purity of the sand is obtained depending on the leaching time. The study was then extended with a numerical approach using the design-of-experiments method, which shows the influence of each parameter and the interactions between them in the process and confirms the experimental results. The resulting model, built with dedicated software, is predictive: based on the parameters measured experimentally within the model's domain, the DOE method makes it possible to predict the response at parameter settings outside the measured points and to obtain the optimized response without performing the experimental measurement.
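With two factors at two levels each (HF concentration at 20%/30%, leaching time at 4 h/8 h), the design is a 2x2 full factorial, and the main effects and interaction follow from simple contrasts. The purity responses below are hypothetical placeholders, not the paper's measurements.

```python
def factorial_effects(y):
    """Main effects and interaction for a 2x2 full factorial design
    with factors coded -1/+1. `y` maps (A, B) level pairs to the
    measured response: A = HF concentration, B = leaching time."""
    effect_A = (y[(1, -1)] + y[(1, 1)] - y[(-1, -1)] - y[(-1, 1)]) / 2
    effect_B = (y[(-1, 1)] + y[(1, 1)] - y[(-1, -1)] - y[(1, -1)]) / 2
    interaction = (y[(1, 1)] + y[(-1, -1)] - y[(1, -1)] - y[(-1, 1)]) / 2
    return effect_A, effect_B, interaction

# hypothetical purity responses (%) for the four factor combinations
runs = {(-1, -1): 97.1, (1, -1): 98.0, (-1, 1): 98.4, (1, 1): 99.5}
```

Fitting these contrasts into a first-order model with interaction, y = b0 + bA·A + bB·B + bAB·A·B, is what lets DOE software predict the response at untested settings, as the abstract describes.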

Keywords: acid leaching, design of experiments method (DOE), silica purity, silica etching

Procedia PDF Downloads 284
20111 Parameters of Validation Method of Determining Polycyclic Aromatic Hydrocarbons in Drinking Water by High Performance Liquid Chromatography

Authors: Jonida Canaj

Abstract:

A simple method for the extraction and determination of fifteen priority polycyclic aromatic hydrocarbons (PAHs) in drinking water by high performance liquid chromatography (HPLC) has been validated in terms of limits of detection (LOD), limits of quantification (LOQ), method recovery, reproducibility, and other factors. HPLC parameters such as mobile phase composition and flow rate were standardized for the determination of PAHs with a fluorescence detector (FLD). Extraction of the PAHs was carried out by liquid-liquid extraction with dichloromethane. The calibration curves showed good linearity for all PAHs (R² = 0.9954-1.0000) over the concentration range 0.1-100 ppb. Analysis of spiked water samples gave recoveries of 78.5-150% at 0.1 ppb and 93.04-137.47% at 10 ppb. The estimated LOD and LOQ ranged between 0.0018 and 0.98 ppb. The method described has been used to determine the content of the fifteen PAHs in drinking water samples.
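LOD and LOQ are commonly estimated from the calibration curve as 3.3·s/m and 10·s/m, where m is the slope and s the standard deviation of the regression residuals. The sketch below applies that rule to illustrative calibration points; the abstract does not state which estimation approach the authors used, so this is one standard possibility.

```python
def lod_loq(concentrations, responses):
    """Estimate LOD = 3.3*s/m and LOQ = 10*s/m from a linear
    calibration curve, where m is the slope and s the standard
    deviation of the regression residuals (n - 2 degrees of freedom)."""
    n = len(concentrations)
    mx = sum(concentrations) / n
    my = sum(responses) / n
    sxx = sum((x - mx) ** 2 for x in concentrations)
    sxy = sum((x - mx) * (y - my)
              for x, y in zip(concentrations, responses))
    slope = sxy / sxx
    intercept = my - slope * mx
    residuals = [y - (slope * x + intercept)
                 for x, y in zip(concentrations, responses)]
    s = (sum(r * r for r in residuals) / (n - 2)) ** 0.5
    return 3.3 * s / slope, 10 * s / slope
```

By construction LOQ is always about three times LOD (10/3.3), which matches the usual relationship between the two validation parameters.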

Keywords: high performance liquid chromatography, HPLC, method validation, polycyclic aromatic hydrocarbons, PAHs, water

Procedia PDF Downloads 102