Search results for: large scale ERP
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3198

2538 Computer Countenanced Diagnosis of Skin Nodule Detection and Histogram Augmentation: Extracting System for Skin Cancer

Authors: S. Zith Dey Babu, S. Kour, S. Verma, C. Verma, V. Pathania, A. Agrawal, V. Chaudhary, A. Manoj Puthur, R. Goyal, A. Pal, T. Danti Dey, A. Kumar, K. Wadhwa, O. Ved

Abstract:

Background: Skin cancer has become a pressing issue in medical science, and its rising incidence is taking a serious toll on health and well-being across the globe. Methods: The extracted image of a skin tumor cannot be used directly for diagnosis, because the stored image contains disturbances around the lesion center. The approach first locates the region of interest in the extracted skin image, and an image-partitioning (segmentation) model is applied to remove the disturbance from the picture. Results: After partitioning, feature extraction is performed using a genetic algorithm (GA), and classification between the trained and test data is then carried out to evaluate the image at large scale and help doctors reach the right prediction. To improve on the existing system, we set our objectives together with an analysis; the efficiency of the selection process and of histogram enhancement is essential in this respect. The GA is applied to reduce the false-positive rate while maintaining accuracy. Conclusions: The objective of this work is to improve effectiveness, and the GA accomplishes its task of bringing down the false-positive rate. The concluding part of the paper reflects on combining deep learning with medical image processing, which provides superior accuracy, and comparable processing enables reuse without errors.
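
A minimal sketch of GA-driven feature selection of the kind described above, assuming a feature matrix already extracted from segmented lesion images; the classifier, operators and parameters are illustrative and not the authors' implementation.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def fitness(mask, X, y):
    """Score a candidate feature subset by cross-validated accuracy."""
    if mask.sum() == 0:
        return 0.0
    clf = SVC(kernel="rbf")
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

def ga_select(X, y, pop_size=20, generations=30, p_mut=0.05):
    """Evolve binary masks over the feature columns of X."""
    n = X.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n))
    for _ in range(generations):
        scores = np.array([fitness(ind, X, y) for ind in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]   # keep the top half
        children = []
        while len(children) < pop_size:
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n)                          # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(n) < p_mut                      # bit-flip mutation
            child[flip] = 1 - child[flip]
            children.append(child)
        pop = np.array(children)
    best = pop[np.argmax([fitness(ind, X, y) for ind in pop])]
    return best.astype(bool)
```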

Keywords: Computer-aided system, detection, image segmentation, morphology.

2537 Preliminary Design of Frozen Soil Simulation System Based on Finite Element Simulation

Authors: Wenyu Song, Bingxi Li, Zhongbin Fu, Baocheng Jiang

Abstract:

The Full-Scale Accelerated Loading System, one part of the "Eleventh Five-Year National Grand Technology Infrastructure Program", is a facility to evaluate the performance and service life of different kinds of pavements subjected to traffic loading under a fully controlled environment. While simulating the environments of the frigid zone and permafrost zone, the accurate control of air temperature, road temperature and roadbed temperature is the key point and also the main difficulty of the design. In this paper, numerical simulations are used to determine the design parameters of the frozen soil simulation system. First, a brief introduction of the Full-Scale Accelerated Loading System was given. Then, the temperature control method of the frozen soil simulation system was proposed. Finally, by using finite element simulations, the optimal design of the frozen soil simulation system was obtained. This proposed design, obtained by finite element simulations, provided a significant reference for the ultimate design of the environment simulation system.

Keywords: China, finite element simulation, frozen soil simulation system, preliminary design.

2536 A Study on Architectural Characteristics of Traditional Iranian Ordinary Houses in Mashhad, Iran

Authors: Rana Daneshvar Salehi

Abstract:

In many Iranian cities, including Mashhad, the capital of Razavi Khorasan Province, ordinary examples of small-scale domestic architecture are not considered heritage, even though the principles of house formation are respected in all traditional Iranian houses, from modest to grand ones. During the past decade, Mashhad has lost its identity and has become a modern city. Its designation as the Capital of Islamic Culture in 2017 by ISESCO, and the consequent drive for new development and transfiguration, led to the demolition of a large number of traditional modest dwellings. For this reason, the present paper aims to introduce three undiscovered houses with historical and monumental value, located in the oldest neighborhoods of Mashhad, which have been neglected in the cultural heritage field. The preliminary phase of this approach is a measured survey to identify the significant characteristics of the selected dwellings and to understand the challenges, focusing on building form, orientation, room function, space proportion and the details of ornamental elements. A comparison between the case studies and wealthy domestic buildings shows that a house belonging to inhabitants with an average income can display the same accurate, regular, harmonic and proportionate design found in the great mansions. It reveals that an ordinary traditional house can be regarded as a valuable construction, not only for its historical characteristics but also for its aesthetic and architectural features, which could help prevent further destruction in the future.

Keywords: Traditional ordinary house, architectural characteristic, proportion, heritage.

2535 Solving the Teacher Assignment-Course Scheduling Problem by a Hybrid Algorithm

Authors: Aldy Gunawan, Kien Ming Ng, Kim Leng Poh

Abstract:

This paper presents a hybrid algorithm for solving a timetabling problem, which is commonly encountered in many universities. The problem combines both teacher assignment and course scheduling problems simultaneously, and is presented as a mathematical programming model. However, this problem becomes intractable and it is unlikely that a proven optimal solution can be obtained by an integer programming approach, especially for large problem instances. A hybrid algorithm that combines an integer programming approach, a greedy heuristic and a modified simulated annealing algorithm collaboratively is proposed to solve the problem. Several randomly generated data sets of sizes comparable to that of an institution in Indonesia are solved using the proposed algorithm. Computational results indicate that the algorithm can overcome difficulties of large problem sizes encountered in previous related works.
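
A hedged sketch of the simulated-annealing stage of such a hybrid: starting from a greedy assignment, random reassignments are accepted according to the usual Metropolis rule while the temperature cools. The cost function, data shapes and parameters are assumptions, not the authors' algorithm.

```python
import math, random

def simulated_annealing(cost, assign, courses, teachers,
                        T0=100.0, alpha=0.95, iters_per_T=200, T_min=1e-3):
    """assign: dict course -> teacher; cost(assign) returns a penalty to minimise."""
    current, best = dict(assign), dict(assign)
    c_cur = c_best = cost(current)
    T = T0
    while T > T_min:
        for _ in range(iters_per_T):
            cand = dict(current)
            cand[random.choice(courses)] = random.choice(teachers)   # random move
            delta = cost(cand) - c_cur
            if delta < 0 or random.random() < math.exp(-delta / T):  # Metropolis rule
                current, c_cur = cand, c_cur + delta
                if c_cur < c_best:
                    best, c_best = dict(current), c_cur
        T *= alpha                                                    # geometric cooling
    return best, c_best
```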

Keywords: Timetabling problem, mathematical programming model, hybrid algorithm, simulated annealing.

2534 Fluidised Bed Gasification of Multiple Agricultural Biomass Derived Briquettes

Authors: Rukayya Ibrahim Muazu, Aiduan Li Borrion, Julia A. Stegemann

Abstract:

Biomass briquette gasification is regarded as a promising route for efficient briquette use in energy generation, fuels and other useful chemicals. However, previous research has focused on briquette gasification in fixed bed gasifiers such as updraft and downdraft gasifiers. A fluidised bed gasifier has the potential to be effectively sized to medium or large scale. This study investigated the use of fuel briquettes produced from blends of rice husk and corn cob biomass in a bubbling fluidised bed gasifier. The study adopted a combination of numerical equations and the Aspen Plus simulation software to predict the product gas (syngas) composition based on briquette density and biomass composition (blend ratio of rice husks to corn cobs). The Aspen Plus model was based on an experimentally validated model from the literature. The results, based on a briquette size of 32 mm diameter and a relaxed density range of 500 to 650 kg/m3, indicated that the fluidisation air required in the gasifier increased with briquette density, and the fluidisation air was shown to be the controlling factor compared with the actual air required for gasification of the biomass briquettes. The mass flow rate of CO2 in the predicted syngas composition increased with an increase in air flow in the gasifier, while CO decreased and H2 remained almost constant. The ratio of H2 to CO for various blends of rice husks and corn cobs did not change significantly at the designed process air, but a significant difference of 1.0 was observed between the 10/90 and 90/10% blends of rice husks and corn cobs.

Keywords: Briquettes, fluidised bed, gasification, Aspen Plus, syngas.

2533 Extracting the Coupled Dynamics in Thin-Walled Beams from Numerical Data Bases

Authors: Mohammad A. Bani-Khaled

Abstract:

In this work, we use the Discrete Proper Orthogonal Decomposition transform to characterize the properties of coupled dynamics in thin-walled beams by exploiting numerical databases obtained from finite element simulations. The outcomes will improve our understanding of the linear and nonlinear coupled behavior of thin-walled beam structures. Thin-walled beams have widespread use in modern engineering applications, in both large-scale structures (aeronautical structures) and nano-structures (nano-tubes). Therefore, detailed knowledge of the properties of coupled vibrations and buckling in these structures is of great interest to the research community. Due to the geometric complexity of the overall structure, and of the cross-sections in particular, it is necessary to use computational mechanics to simulate the dynamics numerically. When numerical computational techniques are used, it is not necessary to oversimplify a model in order to solve the equations of motion. Computational dynamics methods produce databases of controlled resolution in time and space, and these numerical databases contain information on the properties of the coupled dynamics. In order to extract the system's dynamic properties and the strength of coupling among the various fields of the motion, processing techniques are required. The time Proper Orthogonal Decomposition transform is a powerful tool for processing such databases, and it will be used to study the coupled dynamics of thin-walled basic structures. These structures are ideal to form a basis for a systematic study of coupled dynamics in structures of complex geometry.
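
A minimal sketch of the snapshot-POD step implied above, computed via the SVD of a mean-subtracted snapshot matrix: the left singular vectors are the spatial modes and the relative singular values indicate how strongly each mode participates in the response. Variable names and shapes are illustrative.

```python
import numpy as np

def pod_modes(snapshots, n_modes=5):
    """snapshots: (n_dof, n_time) array of FE response snapshots."""
    mean = snapshots.mean(axis=1, keepdims=True)
    U, s, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)
    energy = s**2 / np.sum(s**2)                     # fraction of "energy" per mode
    coeffs = np.diag(s[:n_modes]) @ Vt[:n_modes]     # temporal coefficients of each mode
    return U[:, :n_modes], coeffs, energy[:n_modes]
```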

Keywords: Coupled dynamics, geometric complexity, Proper Orthogonal Decomposition (POD), thin walled beams.

2532 Comparing Data Analysis, Communication and Information Technologies Expertise Levels in Undergraduate Psychology Students

Authors: Ana Cázares

Abstract:

This study has two aims: first, to compare the expertise levels in data analysis, communication and information technologies of undergraduate psychology students; second, to verify the factor structure of the E-ETICA (Escala de Experticia en Tecnologias de la Informacion, la Comunicacion y el Análisis, or Data Analysis, Communication and Information Expertise Scale), which had shown excellent internal consistency (α = 0.92) as well as a simple factor structure. Three factors (Complex and Basic Information and Communications Technologies, and E-Searching and Download Abilities) explain 63% of the variance. In the present study, 260 students (119 juniors and 141 seniors) were asked to respond to the E-ETICA (a 16-item, five-point Likert scale from 1: no mastery to 5: full mastery). The results show that junior and senior students report very similar expertise levels; however, the E-ETICA presents a different factor structure for juniors, in which four factors also explain 63% of the variance: Information E-Searching, Download and Processing; Data Analysis; Organization; and Communication Technologies.

Keywords: Data analysis, information and communications technologies, expertise levels.

2531 Optimal and Critical Path Analysis of State Transportation Network Using Neo4J

Authors: Pallavi Bhogaram, Xiaolong Wu, Min He, Onyedikachi Okenwa

Abstract:

A transportation network is a realization of a spatial network, describing a structure which permits either vehicular movement or the flow of some commodity. Examples include road networks, railways, air routes, pipelines, and many more. The transportation network plays a vital role in maintaining the vigor of the nation's economy; hence, ensuring that the network stays resilient at all times, especially in the face of challenges such as heavy traffic loads and large-scale natural disasters, is of utmost importance. In this paper, we used the Neo4j application to develop the graph. Neo4j is a leading open-source NoSQL native graph database that implements an ACID-compliant transactional backend for applications. The Southern California network model was developed using Neo4j, and the most critical and optimal nodes and paths in the network were obtained using centrality algorithms. The edge betweenness centrality algorithm calculates the critical or optimal paths using Yen's k-shortest paths algorithm, and the node betweenness centrality algorithm calculates the amount of influence a node has over the network. The preliminary study results confirm that the Neo4j application can be a suitable tool to study the important nodes and the critical paths for a major congested metropolitan area.
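
As a stand-in sketch of the two measures used in the study, the snippet below computes node betweenness centrality and Yen-style k-shortest paths with NetworkX rather than Neo4j; the toy graph, node names and edge weights are purely illustrative.

```python
import networkx as nx
from itertools import islice

G = nx.DiGraph()
G.add_weighted_edges_from([
    ("LA", "Anaheim", 26), ("LA", "LongBeach", 20),
    ("LongBeach", "Anaheim", 15), ("Anaheim", "Irvine", 16),
    ("LA", "Irvine", 40),
])

# Influence of each node over shortest paths in the network
node_bc = nx.betweenness_centrality(G, weight="weight")

# First k loopless shortest paths between two nodes (Yen's algorithm)
k_paths = list(islice(nx.shortest_simple_paths(G, "LA", "Irvine", weight="weight"), 3))

print(node_bc)
print(k_paths)
```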

Keywords: Transportation network, critical path, connectivity reliability, network model, Neo4J application, optimal path, edge betweenness centrality index, node betweenness centrality index, Yen’s k-shortest paths.

2530 Measuring Creativity in Die Products for Technological Education

Authors: Ching-Yi Lee, Dyi-Cheng Chen, Bo-Yan Lai, Chin-Pin Chen

Abstract:

Creative design requires new approaches to assessment in vocational and technological education. To date, there has been little discussion on instruments used to evaluate dies produced by students in vocational and technological education. Developing a generic instrument has been very difficult due to the diversity of creative domains, the specificity of content, and the subjectivity involved in judgment. This paper presents an instrument for measuring the creativity in the design of products by expanding the Consensual Assessment Technique (CAT). The content-based scale was evaluated for content validity by 5 experts. The scale comprises 5 criteria: originality; practicability; precision; aesthetics; and exchangeability. Nine experts were invited to evaluate the dies produced by 38 college students who enrolled in a Product Design and Development course. To further explore the degree of rater agreement, inter-rater reliability was calculated for each dimension using Kendall's coefficient of concordance test. The inter-judge reliability scores achieved significance, with coefficients ranging from 0.53 to 0.71.
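
A hedged sketch of the agreement statistic reported above, Kendall's coefficient of concordance W = 12S / (m²(n³ − n)) for m raters and n items; the rating matrix is made up, and ties are not corrected for.

```python
import numpy as np

def kendalls_w(ratings):
    """ratings: (n_raters, n_items) matrix of scores."""
    m, n = ratings.shape
    # Convert each rater's scores to ranks 1..n (ties broken arbitrarily)
    ranks = np.apply_along_axis(lambda r: np.argsort(np.argsort(r)) + 1, 1, ratings)
    R = ranks.sum(axis=0)                     # rank sum per item
    S = np.sum((R - R.mean()) ** 2)           # spread of the rank sums
    return 12 * S / (m**2 * (n**3 - n))

# Example: 3 raters scoring 5 student dies on one creativity criterion
print(kendalls_w(np.array([[4, 2, 5, 1, 3],
                           [5, 1, 4, 2, 3],
                           [4, 2, 5, 3, 1]])))
```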

Keywords: Design education, die creative product, vocational and technological education, Consensual Assessment Technique (CAT).

2529 DIFFER: A Propositionalization approach for Learning from Structured Data

Authors: Thashmee Karunaratne, Henrik Böstrom

Abstract:

Logic-based methods for learning from structured data are limited with respect to handling large search spaces, preventing large-sized substructures from being considered by the resulting classifiers. A novel approach to learning from structured data is introduced that employs a structure transformation method, called finger printing, for addressing these limitations. The method, which generates features corresponding to arbitrarily complex substructures, is implemented in a system called DIFFER. The method is demonstrated to perform comparably to an existing state-of-the-art method on some benchmark data sets without requiring restrictions on the search space. Furthermore, learning from the union of features generated by finger printing and the previous method outperforms learning from each individual set of features on all benchmark data sets, demonstrating the benefit of developing complementary, rather than competing, methods for structure classification.

Keywords: Machine learning, Structure classification, Propositionalization.

2528 Efficiency for Sustainable Growth: Evidence from the North African Countries

Authors: Ekrem Erdem, Can Tansel Tugcu

Abstract:

Improved resource efficiency of production is a key requirement for sustainable growth worldwide. In this regard, by considering energy and tourism as extra inputs to the classical Cobb-Douglas production function, this study investigates efficiency changes in the North African countries. To this end, the study uses panel data for the period 1995-2010 and adopts the Malmquist index based on data envelopment analysis. Results show that tourism increases technical and scale efficiencies, while it decreases technological change and total factor productivity change. On the other hand, when the production function is augmented by the energy input, technical efficiency change decreases, while technological change, scale efficiency change and total factor productivity change increase. Thus, in order to satisfy the needs for sustainable growth, North African governments should take measures to increase the contribution that tourism makes to economic growth, and others to ensure the efficient use of resources in the energy sector.

Keywords: Data envelopment analysis, Economic efficiency, North African countries, Sustainable growth.

2527 Visual Thing Recognition with Binary Scale-Invariant Feature Transform and Support Vector Machine Classifiers Using Color Information

Authors: Wei-Jong Yang, Wei-Hau Du, Pau-Choo Chang, Jar-Ferr Yang, Pi-Hsia Hung

Abstract:

The demand for smart visual thing recognition in various devices has increased rapidly for daily smart production, living and learning systems in recent years. This paper proposes a visual thing recognition system which combines the binary scale-invariant feature transform (SIFT), the bag-of-words model (BoW), and support vector machine (SVM) classifiers using color information. Since traditional SIFT features and SVM classifiers use only gray-level information, color remains an important feature for visual thing recognition. With color-based SIFT features and an SVM, we can discard unreliable matching pairs and increase the robustness of matching tasks. The experimental results show that the proposed object recognition system with the color-assisted SIFT SVM classifier achieves a higher recognition rate than the traditional gray SIFT and SVM classification in various situations.
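
A rough sketch of a pipeline in the spirit described above: per-channel SIFT descriptors as a simple form of color SIFT, a bag-of-words histogram from a k-means vocabulary, and an SVM. Function names, the per-channel strategy and all parameters are assumptions, not the authors' code.

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

sift = cv2.SIFT_create()

def color_sift(img_bgr):
    """Stack SIFT descriptors computed independently on each color channel."""
    descs = []
    for ch in cv2.split(img_bgr):
        _, d = sift.detectAndCompute(ch, None)
        if d is not None:
            descs.append(d)
    return np.vstack(descs) if descs else np.zeros((0, 128), np.float32)

def bow_histogram(desc, vocab):
    """Quantize descriptors against the vocabulary and build a normalized histogram."""
    words = vocab.predict(desc) if len(desc) else np.array([], int)
    hist, _ = np.histogram(words, bins=np.arange(vocab.n_clusters + 1))
    return hist / max(hist.sum(), 1)

def train(images, labels, k=200):
    """Build the visual vocabulary, encode all images, and fit the SVM."""
    all_desc = np.vstack([color_sift(im) for im in images])
    vocab = KMeans(n_clusters=k, n_init=10).fit(all_desc)
    X = np.array([bow_histogram(color_sift(im), vocab) for im in images])
    clf = SVC(kernel="rbf").fit(X, labels)
    return vocab, clf
```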

Keywords: Color moments, visual thing recognition system, SIFT, color SIFT.

2526 Information System for Early Diabetic Retinopathy Diagnostics Based on Multiscale Texture Gradient Method

Authors: L. S. Godlevsky, N. V. Kresyun, V. P. Martsenyuk, K. S. Shakun, T. V. Tatarchuk, K. O. Prybolovets, L. F. Kalinichenko, M. Karpinski, T. Gancarczyk

Abstract:

Structures of the eye fundus were extracted using the multiscale texture gradient method, and the color characteristics of the macular zone and vessels were verified in the CIELAB scale. The differences in the average values of the L*, a* and b* coordinates of the CIE (International Commission on Illumination) scale between patients with diabetes and healthy volunteers were compared. The average value of L* in diabetic patients exceeded that of the practically healthy group by a factor of 2.71 (P < 0.05), while the value of the a* index was reduced by a factor of 3.8 compared with the control value (P < 0.05). The b* index exceeded that of the control group by a factor of 12.4 (P < 0.05). The integrated index of color difference (ΔE) exceeded the control value by a factor of 2.87 (P < 0.05). More pronounced differences in ΔE were followed by a shorter period until MA appearance, with a correlation of -0.56 (P < 0.05). The specificity of diagnostics increased by a factor of 2.17 (P < 0.05), and the negative prognostic index exceeded that determined with the expert method by a factor of 2.26 (P < 0.05).
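
A minimal sketch of a color-difference index over (L*, a*, b*) coordinates; the abstract does not state which ΔE formula was used, so the simple CIE76 Euclidean distance is shown, and the sample values are made up.

```python
import math

def delta_e_cie76(lab1, lab2):
    """Euclidean distance between two (L*, a*, b*) triples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

print(delta_e_cie76((54.2, 3.1, 10.8), (20.0, 11.8, 0.9)))  # illustrative values only
```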

Keywords: Diabetic retinopathy, multiscale texture gradient, color spectrum analysis.

2525 Novel Rao-Blackwellized Particle Filter for Mobile Robot SLAM Using Monocular Vision

Authors: Maohai Li, Bingrong Hong, Zesu Cai, Ronghua Luo

Abstract:

This paper presents a novel Rao-Blackwellised particle filter (RBPF) for mobile robot simultaneous localization and mapping (SLAM) using monocular vision. The particle filter is combined with an unscented Kalman filter (UKF) to extend the path posterior by sampling new poses that integrate the current observation, which drastically reduces the uncertainty about the robot pose. The landmark position estimation and update are also implemented through the UKF. Furthermore, the number of resampling steps is determined adaptively, which greatly reduces the particle depletion problem, and evolution strategies (ES) are introduced to avoid particle impoverishment. The 3D natural point landmarks are structured with matching Scale Invariant Feature Transform (SIFT) feature pairs. The matching of multi-dimensional SIFT features is implemented with a KD-tree at a time cost of O(log2 N). Experimental results on a real robot in our indoor environment show the advantages of our methods over previous approaches.
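
A hedged sketch of one common way to make resampling adaptive, as mentioned above: resample only when the effective sample size of the particle weights falls below a threshold. The exact criterion used in the paper is not stated in the abstract, so this is an assumption.

```python
import numpy as np

def effective_sample_size(weights):
    w = weights / weights.sum()
    return 1.0 / np.sum(w ** 2)

def maybe_resample(particles, weights, threshold_ratio=0.5, rng=np.random.default_rng()):
    """particles: (n, d) array; weights: (n,) array. Resample only when ESS is low."""
    n = len(weights)
    if effective_sample_size(weights) < threshold_ratio * n:
        idx = rng.choice(n, size=n, p=weights / weights.sum())  # multinomial resampling
        return particles[idx], np.full(n, 1.0 / n)
    return particles, weights
```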

Keywords: Mobile robot, simultaneous localization and mapping, Rao-Blackwellised particle filter, evolution strategies, scale invariant feature transform.

2524 A Meta-Heuristic Algorithm for Set Covering Problem Based on Gravity

Authors: S. Raja Balachandar, K. Kannan

Abstract:

A new meta-heuristic approach called the "randomized gravitational emulation search algorithm (RGES)" for solving large-size set covering problems has been designed. This algorithm is founded upon introducing the concept of randomization along with two of the four primary parameters in physics, velocity and gravity. A new heuristic operator is introduced in the domain of RGES to maintain feasibility specifically for the set covering problem and to yield the best solutions. The performance of this algorithm has been evaluated on a large set of benchmark problems from the OR-Library. Computational results showed that the randomized gravitational emulation search algorithm-based heuristic is capable of producing high-quality solutions. The performance of this heuristic, when compared with other existing heuristic algorithms, is found to be excellent in terms of solution quality.

Keywords: Set covering problem, velocity, gravitational force, Newton's law, meta heuristic, combinatorial optimization.

2523 U.S. Nuclear Regulatory Commission Training for Research and Training Reactor Inspectors

Authors: Gary Marlin Sandquist

Abstract:

Currently, a large number of licensing activities (Early Site Permits, Combined Operating Licenses, reactor certifications, etc.) are pending review before the United States Nuclear Regulatory Commission (US NRC). Much of the senior staff at the NRC is now committed to these review and licensing actions. To address this additional workload, the NRC has recruited a large number of new regulatory staff for dealing with these and other regulatory actions, such as the US fleet of Research and Test Reactors (RTRs). These reactors pose unusual demands on regulatory staff since the US fleet of RTRs, although small (32 licensed RTRs as of 2010), represents a broad range of reactor types, operations, and research and training aspects that nuclear power plants (such as the 104 LWRs) do not. The NRC must inspect and regulate all these facilities. This paper addresses selected training topics and regulatory activities provided to NRC inspectors for RTRs.

Keywords: Regulations, Research and Test Reactors, Training, US NRC

2522 A Dual Fitness Function Genetic Algorithm: Application on Deterministic Identical Machine Scheduling

Authors: Saleem Z. Ramadan, Gürsel A. Süer

Abstract:

In this paper, a genetic algorithm (GA) with a dual fitness function is proposed and applied to solve the deterministic identical machine scheduling problem. The mating fitness function value was used to determine the mating of chromosomes, while the selection fitness function value was used to determine their survival. The performance of this algorithm was tested on deterministic identical machine scheduling using simulated data. The results obtained from the proposed GA were compared with a classical GA and integer programming (IP). Results showed that the dual-fitness-function GA outperformed the classical single-fitness-function GA with statistical significance for large problems and was competitive with IP, particularly when large problem sizes were used.

Keywords: Machine scheduling, Genetic algorithms, Due dates, Number of tardy jobs, Number of early jobs, Integer programming, Dual Fitness functions.

2521 The Effects of Extracorporeal Shockwave Therapy on Pain, Function, Range of Motion, and Strength in Patients with Insertional Achilles Tendinosis

Authors: P. Sanzo

Abstract:

Increased participation in physical fitness has been paralleled by increased overuse injuries such as insertional Achilles tendinosis (AT). Treatment has provided inconsistent results. The use of extracorporeal shockwave therapy (ECSWT) offers a new treatment consideration. The purpose of this study was to assess the effects of ECSWT on pain, function, range of motion (ROM), joint mobility and strength in patients with AT. Thirty subjects were treated with ECSWT, and measures were taken before and three months after treatment. There were significant differences in visual analog scale (VAS) scores for pain at rest (p=0.002); after activity (p=0.0001); overall improvement (p=0.0001); Lower Extremity Functional Scale (LEFS) scores (p=0.002); dorsiflexion range of motion (ROM) (p=0.0001); plantarflexion strength (p=0.025); talocrural joint anterior glide (p=0.046); and subtalar joint medial and lateral glide (p=0.025). ECSWT offers a new intervention that may limit the progression of the disorder and the long-term healthcare costs associated with AT.

Keywords: Extracorporeal shockwave therapy, shockwave therapy, Achilles tendinosis, range of motion, strength, joint mobility

2520 Applying Spanning Tree Graph Theory for Automatic Database Normalization

Authors: Chetneti Srisa-an

Abstract:

In the field of knowledge and data engineering, the relational database is the best repository for storing real-world data, and it has been used around the world for decades. Normalization is the most important process in the analysis and design of relational databases. It aims at creating a set of relational tables with minimum data redundancy that preserve consistency and facilitate correct insertion, deletion, and modification. Normalization is a major task in the design of relational databases. Despite its importance, very few algorithms have been developed for use in the design of commercial automatic normalization tools, and it is rare for normalization to be performed automatically rather than manually. Moreover, for the large and complex databases of today, it is even harder to do manually. This paper presents a new, fully automated relational database normalization method. It first produces a directed graph and a spanning tree, and then proceeds to generate the 2NF, 3NF and BCNF normal forms. The benefit of this new algorithm is that it can cope with a large set of complex functional dependencies.
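
As an illustrative sketch (not the paper's algorithm), any functional-dependency-driven normalization tool needs an attribute-closure step to test candidate keys; the relation, attributes and dependencies below are hypothetical.

```python
def closure(attrs, fds):
    """attrs: set of attribute names; fds: list of (lhs_set, rhs_set) pairs."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:   # lhs determined, rhs not yet added
                result |= rhs
                changed = True
    return result

fds = [({"emp_id"}, {"dept_id", "name"}), ({"dept_id"}, {"dept_name"})]
print(closure({"emp_id"}, fds))   # emp_id determines every attribute, so it is a key
```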

Keywords: Relational Database, Functional Dependency, Automatic Normalization, Primary Key, Spanning tree.

2519 Sulphur-Mediated Precipitation of Pt/Fe/Co/Cr Ions in Liquid-Liquid and Gas-Liquid Chloride Systems

Authors: J. Siame, H. Kasaini

Abstract:

Proof-of-concept experiments were conducted to determine the feasibility of using small amounts of Dissolved Sulphur (DS) from the gaseous phase to precipitate platinum ions in chloride media. Two sets of precipitation experiments were performed in which the source of sulphur atoms was either a thiosulphate solution (Na2S2O3) or sulphur dioxide gas (SO2). In the liquid-liquid (L-L) system, complete precipitation of Pt was achieved at small dosages of Na2S2O3 (0.01 – 1.0 M) within 3-5 minutes. On the basis of this result, gas absorption tests were carried out mainly to achieve a sulphur solubility equivalent to 0.018 M. The idea that large amounts of precious metals could be recovered selectively from their dilute solutions by utilizing waste SO2 streams at low pressure seemed attractive from the economic and environmental points of view. Therefore, the mass transfer characteristics of SO2 gas associated with reactive absorption across the gas-liquid (G-L) interface were evaluated under different conditions of pressure (0.5 – 2 bar), solution temperature (20 – 50 °C) and acid strength (1 – 4 M HCl). This paper concludes with information about the selective precipitation of Pt in the presence of cations (Fe2+, Co2+, and Cr3+) in a CSTR and a recommendation to scale up the laboratory data to industrial pilot-scale operations.

Keywords: CSTR, diffusivity, platinum, selective precipitation, sulphur dioxide, thiosulphate.

2518 A Study of Gaps in CBMIR Using Different Methods and Prospective

Authors: Pradeep Singh, Sukhwinder Singh, Gurjinder Kaur

Abstract:

In recent years, rapid advances in software and hardware in the field of information technology, along with a digital imaging revolution in the medical domain, have facilitated the generation and storage of large collections of images by hospitals and clinics. Searching these large image collections effectively and efficiently poses significant technical challenges, and it raises the necessity of constructing intelligent retrieval systems. Content-based Image Retrieval (CBIR) consists of retrieving the most visually similar images to a given query image from a database of images [5]. Medical CBIR (content-based image retrieval) applications pose unique challenges but at the same time offer many new opportunities: while one can easily understand news or sports videos, a medical image is often completely incomprehensible to untrained eyes.

Keywords: Classification, clustering, content-based image retrieval (CBIR), relevance feedback (RF), statistical similarity matching, support vector machine (SVM).

2517 Estimation of Relative Permeabilities and Capillary Pressures in Shale Using Simulation Method

Authors: F. C. Amadi, G. C. Enyi, G. Nasr

Abstract:

Relative permeabilities are practical factors that are used to correct the single-phase Darcy’s law for application to multiphase flow. For effective characterisation of large-scale multiphase flow in hydrocarbon recovery, relative permeabilities and capillary pressures are used. These parameters are acquired via special core flooding experiments, and the special core analysis (SCAL) module of reservoir simulation is applied by engineers for their evaluation. However, core flooding experiments on shale core samples are expensive and time consuming before the various flow assumptions, for instance Darcy’s law, are satisfied. This makes it imperative to apply core flooding simulations, in which various analyses of relative permeabilities and capillary pressures of multiphase flow can be carried out efficiently and effectively at a reasonable pace. This paper presents a Sendra software simulation of core flooding to obtain relative permeabilities and capillary pressures using different correlations. The approach used in this study comprised three steps. First, basic petrophysical parameters of the Marcellus shale sample, such as porosity, were determined using laboratory techniques. Secondly, core flooding was simulated for a particular injection scenario using different correlations. Thirdly, the best-fit correlations for the estimation of relative permeability and capillary pressure were obtained. This research approach saves cost and time and is very reliable in the computation of relative permeability and capillary pressures at steady or unsteady state, and in drainage or imbibition processes, in the oil and gas industry when compared with other methods.

Keywords: Special core analysis (SCAL), relative permeability, capillary pressures, drainage, imbibition.

2516 Airfoils Aerodynamic Efficiency Study in Heavy Rain via Two Phase Flow Approach

Authors: M. Ismail, Cao Yihua, Zhao Ming

Abstract:

Heavy rainfall greatly affects the aerodynamic performance of aircraft, and many aircraft accidents have been caused by the degradation of aerodynamic efficiency in heavy rain. In this paper, we study the effects of heavy rain on the aerodynamic efficiency of the NACA 64-210 and NACA 0012 airfoils. For our analysis, a CFD method and a preprocessing grid generator are used as the main analytical tools, and the simulation of rain is accomplished via the two-phase flow approach's Discrete Phase Model (DPM). Raindrops are assumed to be non-interacting, non-deforming, non-evaporating and non-spinning spheres. Both airfoil sections exhibited a significant reduction in lift and an increase in drag for a given lift condition in simulated rain. The most significant difference between these two airfoils was the sensitivity of the NACA 64-210 to liquid water content (LWC), while the performance losses of the NACA 0012 in the rain environment are not a function of LWC. It is expected that the quantitative information gained in this paper will be useful to the operational airline industry, and greater effort, such as small-scale and full-scale flight tests, should be put in this direction to further improve aviation safety.

Keywords: airfoil, discrete phase modeling, heavy rain, Reynolds number

2515 The Effect of Main Factors on Forces during FSJ Processing of AA2024 Aluminum

Authors: Dunwen Zuo, Yongfang Deng, Bo Song

Abstract:

An attempt is made here to measure the forces in three directions, under different feed speeds, different tool tilt angles, and with or without a pin on the tool, using an octagonal ring dynamometer in the AA2024 aluminum FSJ (Friction Stir Joining) process, and to investigate how the four main factors influence the forces in the FSJ process. It is found that a high feed speed leads to a small feed force and a small lateral force, but to a large feed force in the stable joining stage of the process. As the rotational speed increases, the time required for the axial force to drop from the maximum to the minimum increases in the push-up process. In the stable joining stage, the rotational speed has little effect on the feed force, while a large rotational speed leads to a small lateral force and axial force. The maximum axial force increases as the tool tilt angle increases in the downward movement stage. At the moment feeding starts, as the tool tilt angle increases, the amplitude of the increase in axial force becomes larger. In the stable joining stage, with an increase in the tool tilt angle, the axial force increases, the lateral force decreases, and the feed force remains almost unchanged. A tool with a pin decreases the axial force in the downward movement stage; in the stable joining stage, the feed force and lateral force increase, but the axial force is reduced, when the tool with a pin is used compared with the tool without a pin.

Keywords: FSJ, force factor, AA2024, friction stir joining.

2514 Assessing the Physiological, Psychological Stressors and Coping Strategies among Hemodialysis Patients in the Kingdom of Saudi Arabia

Authors: A. Seham A. Elgamal, Reham H. Saleh

Abstract:

Chronic kidney disease has become a global health problem. In order to maintain a patient’s life and improve the survival rate, hemodialysis is essential to replace the function of the kidneys. However, these patients may complain of multiple physical and psychological stressors due to the nature of the disease and the need for frequent hemodialysis sessions, and they use various strategies to cope with the stressors related to their disease and the treatment procedures. A cross-sectional, descriptive study was carried out to achieve the aim of the study. A convenience sample including all adult patients was recruited for this study. The Hemodialysis Stressors Scale (HSS) and the Jalowiec Coping Scale (JCS) were used to investigate the stressors and coping strategies of 89 hemodialysis patients at a governmental hospital (King Khalid Hospital, Jeddah). Results of the study revealed that 50.7% experienced physiological stressors and 38% experienced psychosocial stressors. Optimistic, fatalistic, and supportive coping strategies were the most commonly used, with mean scores of 2.88 ± 0.75, 2.87 ± 0.75, and 1.82 ± 0.71, respectively. In conclusion, familiarity with the types of stressors and the effective coping strategies of hemodialysis patients and their families is important in order to enhance their adaptation to chronic kidney disease.

Keywords: Coping strategies, hemodialysis, physiological stressors, psychological stressors.

2513 Evaluating some Feature Selection Methods for an Improved SVM Classifier

Authors: Daniel Morariu, Lucian N. Vintan, Volker Tresp

Abstract:

Text categorization is the problem of classifying text documents into a set of predefined classes. After a preprocessing step, the documents are typically represented as large sparse vectors. When training classifiers on large collections of documents, both the time and memory restrictions can be quite prohibitive. This justifies the application of feature selection methods to reduce the dimensionality of the document-representation vector. Four feature selection methods are evaluated: Random Selection, Information Gain (IG), Support Vector Machine-based selection (called SVM_FS) and Genetic Algorithm with SVM (GA_FS). We show that the best results were obtained with the SVM_FS and GA_FS methods for a relatively small dimension of the feature vector, compared with the IG method, which involves longer vectors for quite similar classification accuracies. We also present a novel method to better correlate the SVM kernel's parameters (polynomial or Gaussian kernel).
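
A minimal sketch, for orientation only, of information-gain-style ranking using scikit-learn's mutual information estimate on a bag-of-words matrix; the data, the cut-off and the commented usage are assumptions, and the paper's SVM_FS and GA_FS methods are not reproduced here.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.svm import SVC

def select_top_k(X, y, k=500):
    """Rank features by estimated mutual information with the class labels."""
    scores = mutual_info_classif(X, y, discrete_features=True)
    return np.argsort(scores)[-k:]          # indices of the k highest-scoring features

# Hypothetical usage on a term-count matrix X_train with labels y_train:
# keep = select_top_k(X_train, y_train, k=500)
# clf = SVC(kernel="poly", degree=2).fit(X_train[:, keep], y_train)
```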

Keywords: Features selection, learning with kernels, support vector machine, genetic algorithms and classification.

2512 Lattice Boltzmann Method for Turbulent Heat Transfer in Wavy Channel Flows

Authors: H.Y. Lai, S. C. Chang, W. L. Chen

Abstract:

The hydrodynamic and thermal lattice Boltzmann methods are applied to investigate turbulent convective heat transfer in wavy channel flows. In this study, the turbulent phenomena are modeled by large-eddy simulation with the Smagorinsky model. As a benchmark, the laminar and turbulent backward-facing step flows are simulated first, and the results give good agreement with other numerical and experimental data. For wavy channel flows, the distribution of the Nusselt number and the skin-friction coefficients are calculated to evaluate the heat transfer effect and the drag force. The results indicate that the vortices at the trough affect the magnitude of the drag and weaken the heat convection effects on the wavy surface. In turbulent cases, if the amplitude of the wavy boundary is large enough, secondary vortices are generated at the troughs and contribute to the heat convection. Finally, the effects of different Reynolds numbers on the turbulent transport phenomena are discussed.

Keywords: Heat transfer, lattice Boltzmann method, turbulence, wavy channel.

2511 Large Eddy Simulation of Hydrogen Deflagration in Open Space and Vented Enclosure

Authors: T. Nozu, K. Hibi, T. Nishiie

Abstract:

This paper discusses the applicability of a numerical model for a damage prediction method for accidental hydrogen explosions occurring in a hydrogen facility. The numerical model was based on the unstructured finite volume method (FVM) code “NuFD/FrontFlowRed”. For simulating the unsteady turbulent combustion of leaked hydrogen gas, a combination of Large Eddy Simulation (LES) and a combustion model was used. The combustion model was based on a two-scalar flamelet approach, in which a G-equation model and a conserved scalar model expressed the propagation of the premixed flame surface and the diffusion combustion process, respectively. For validation of this numerical model, we have simulated two previous hydrogen explosion tests. One is an open-space explosion test, in which the source was a prismatic 5.27 m3 volume with a 30% hydrogen-air mixture; a reinforced concrete wall was set 4 m away from the front surface of the source, and the source was ignited at the bottom center by a spark. The other is a vented-enclosure explosion test, in which the chamber was 4.6 m × 4.6 m × 3.0 m with a vent opening of 5.4 m2 on one side; the test was performed with ignition at the center of the wall opposite the vent, and hydrogen-air mixtures with hydrogen concentrations close to 18 vol% were used. The results from the numerical simulations are compared with the previous experimental data to assess the accuracy of the numerical model, and we have verified that the simulated overpressures and flame time-of-arrival data are in good agreement with the results of the two explosion tests.

Keywords: Deflagration, Large Eddy Simulation, Turbulent combustion, Vented enclosure.

2510 A Survey of Field Programmable Gate Array-Based Convolutional Neural Network Accelerators

Authors: Wei Zhang

Abstract:

With the rapid development of deep learning, neural network and deep learning algorithms play a significant role in various practical applications. Due to their high accuracy and good performance, Convolutional Neural Networks (CNNs) in particular have become a research hot spot in the past few years. However, the size of the networks is becoming increasingly large to meet the demands of practical applications, which poses a significant challenge for constructing high-performance implementations of deep learning neural networks. Meanwhile, many of these application scenarios also have strict requirements on the performance and power consumption of hardware devices. Therefore, it is particularly critical to choose a suitable computing platform for hardware acceleration of CNNs. This article surveys recent advances in Field Programmable Gate Array (FPGA)-based acceleration of CNNs. Various designs and implementations of FPGA-based accelerators for different devices and network models are reviewed, and comparisons with Graphics Processing Units (GPUs), Application Specific Integrated Circuits (ASICs) and Digital Signal Processors (DSPs) are made to present our own critical analysis and comments. Finally, we discuss different perspectives on these acceleration and optimization methods for FPGA platforms to further explore the opportunities and challenges for future research, and we give a prospect for the future development of FPGA-based accelerators.

Keywords: Deep learning, field programmable gate array, FPGA, hardware acceleration, convolutional neural networks, CNN.

2509 Retail Strategy to Reduce Waste Keeping High Profit Utilizing Taylor's Law in Point-of-Sales Data

Authors: Gen Sakoda, Hideki Takayasu, Misako Takayasu

Abstract:

Waste reduction is a fundamental problem for sustainability. Methods for waste reduction using point-of-sales (POS) data are proposed, utilizing the knowledge of a recent econophysics study on a statistical property of POS data. Concretely, a non-stationary time series analysis method based on the Particle Filter is developed, which takes into account the anomalous fluctuation scaling known as Taylor's law. The method is extended to handle sales data that are incomplete because of stock-outs, by introducing maximum likelihood estimation for censored data. A way of determining optimal stock levels that prices the cost of waste reduction is also proposed. This study focuses on examining the methods for large sales numbers, where Taylor's law is evident. Numerical analysis using aggregated POS data shows the effectiveness of the methods in reducing food waste while maintaining a high profit for large sales numbers. Moreover, the pricing of the cost of waste reduction reveals that a small profit loss achieves substantial waste reduction, especially when the proportionality constant of Taylor's law is small. Specifically, a profit loss of around 1% halves disposal when the constant is 0.12, which is the actual value for the processed food items used in this research. The methods provide practical and effective solutions for waste reduction while keeping a high profit, especially for large sales numbers.
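
A hedged sketch of checking Taylor's fluctuation scaling on POS counts, fitting variance ≈ c · mean^b across items in log-log space; the data are synthetic and this is not the paper's Particle Filter estimator.

```python
import numpy as np

def fit_taylors_law(sales_by_item):
    """sales_by_item: list of 1-D arrays of daily sales counts, one array per item."""
    means = np.array([s.mean() for s in sales_by_item])
    variances = np.array([s.var(ddof=1) for s in sales_by_item])
    b, log_c = np.polyfit(np.log(means), np.log(variances), 1)  # slope and intercept
    return np.exp(log_c), b          # proportionality constant c and scaling exponent b

rng = np.random.default_rng(1)
items = [rng.poisson(lam, size=365) for lam in (2, 10, 50, 200, 1000)]  # synthetic items
print(fit_taylors_law(items))
```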

Keywords: Food waste reduction, particle filter, point of sales, sustainable development goals, Taylor's Law, time series analysis.
