Search results for: rough and finish rolling processes
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1891

1261 Performance Analysis of Search Medical Imaging Service on Cloud Storage Using Decision Trees

Authors: González A. Julio, Ramírez L. Leonardo, Puerta A. Gabriel

Abstract:

Telemedicine services use large amounts of data, most of which are diagnostic images in Digital Imaging and Communications in Medicine (DICOM) and Health Level Seven (HL7) formats. Metadata is generated for each image to support its identification. This study presents the use of decision trees to optimize the search for diagnostic images hosted on a cloud server. To analyze server performance, the following quality of service (QoS) metrics are evaluated: delay, bandwidth, jitter, latency and throughput, in five test scenarios for a total of 26 experiments during the uploading and downloading of DICOM images hosted by the telemedicine group server of the Universidad Militar Nueva Granada, Bogotá, Colombia. By applying decision trees as a data mining technique and comparing them with sequential search, it was possible to evaluate the search times for diagnostic images on the server. The results show that by using the metadata in decision trees, search times are substantially improved, computational resources are optimized and the request management of the telemedicine image service is improved. Based on the experiments carried out, search efficiency increased by 45% relative to sequential search, because false positives are avoided in the management and acquisition of the requested information when a diagnostic image is downloaded. It is concluded that, for diagnostic image services in telemedicine, decision trees guarantee accessibility and robustness in the acquisition and manipulation of medical images, improving diagnoses and medical procedures for patients.
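As a rough illustration of the idea (not the authors' implementation), the sketch below trains a decision tree on synthetic metadata fields to narrow the candidate set before matching, and compares it with a plain sequential scan; the field names and the scikit-learn dependency are assumptions made for this example.

```python
# Illustrative sketch only: a decision tree over assumed DICOM metadata fields
# predicts which storage node holds an image, so only that node is scanned,
# instead of sequentially comparing every record. Data are synthetic.
from sklearn.tree import DecisionTreeClassifier
import numpy as np

rng = np.random.default_rng(0)
n = 5000
# Synthetic metadata: modality code (0-3), body-part code (0-9), study-year offset (0-14)
X = np.column_stack([rng.integers(0, 4, n),
                     rng.integers(0, 10, n),
                     rng.integers(0, 15, n)])
node = (X[:, 0] * 10 + X[:, 1]) % 8            # storage node each image ended up on

def sequential_search(query, records):
    """Baseline: compare the query against every metadata record."""
    return [i for i, r in enumerate(records) if np.array_equal(r, query)]

tree = DecisionTreeClassifier(random_state=0).fit(X, node)

def tree_search(query, records, nodes):
    """Ask the tree which node should hold the image, then scan only that node."""
    predicted = tree.predict(query.reshape(1, -1))[0]
    candidates = np.flatnonzero(nodes == predicted)
    return [i for i in candidates if np.array_equal(records[i], query)]

query = X[123]
print("sequential matches:", sequential_search(query, X))
print("tree-guided matches:", tree_search(query, X, node))
```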

Keywords: Cloud storage, decision trees, diagnostic image, search, telemedicine.

1260 Nanofluid-Based Emulsion Liquid Membrane for Selective Extraction and Separation of Dysprosium

Authors: Maliheh Raji, Hossein Abolghasemi, Jaber Safdari, Ali Kargari

Abstract:

Dysprosium is a rare earth element which is essential for many growing high-technology applications. Dysprosium, along with neodymium, plays a significant role in applications such as metal halide lamps, permanent magnets, and nuclear reactor control rods. The purification and separation of rare earth elements are challenging because of their similar chemical and physical properties. Among the various methods, membrane processes provide many advantages over conventional separation processes such as ion exchange and solvent extraction. In this work, the selective extraction and separation of dysprosium from aqueous solutions containing an equimolar mixture of dysprosium and neodymium by emulsion liquid membrane (ELM) was investigated. The organic membrane phase of the ELM was a nanofluid consisting of multiwalled carbon nanotubes (MWCNT), Span 80 as surfactant, Cyanex 272 as carrier, kerosene as base fluid, and a nitric acid solution as the internal aqueous phase. Factors affecting the separation of dysprosium, such as carrier concentration, MWCNT concentration, feed phase pH and stripping phase concentration, were analyzed using the Taguchi method. The optimal experimental condition was obtained using analysis of variance (ANOVA) after 10 min of extraction. Based on the results, using the MWCNT nanofluid in the ELM process increases the extraction owing to the higher stability of the membrane and enhanced mass transfer, and a separation factor of 6 for dysprosium over neodymium can be achieved under the optimum conditions. Additionally, the demulsification process was performed successfully and the membrane phase was reused effectively under the optimum conditions.
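The Taguchi analysis step can be illustrated with a short sketch; the L9 factor assignment and the extraction efficiencies below are invented for illustration and are not the study's data (larger-the-better signal-to-noise ratios are used, since higher extraction is better).

```python
# Minimal sketch (assumed data): Taguchi "larger-the-better" signal-to-noise
# ratios for extraction efficiency, averaged per factor level to rank factor effects.
import numpy as np

# L9(3^4) orthogonal array: columns = carrier conc., MWCNT conc., feed pH, strip conc.
L9 = np.array([[1,1,1,1],[1,2,2,2],[1,3,3,3],
               [2,1,2,3],[2,2,3,1],[2,3,1,2],
               [3,1,3,2],[3,2,1,3],[3,3,2,1]])
extraction = np.array([72, 81, 78, 85, 90, 76, 88, 83, 79], dtype=float)  # %, invented

# Larger-the-better: S/N = -10*log10(mean(1/y^2)); one replicate per run here.
sn = -10 * np.log10(1.0 / extraction**2)

factors = ["carrier", "MWCNT", "feed pH", "strip acid"]
for j, name in enumerate(factors):
    level_means = [sn[L9[:, j] == lvl].mean() for lvl in (1, 2, 3)]
    print(f"{name:10s} S/N by level: {np.round(level_means, 2)} "
          f"(effect range {max(level_means) - min(level_means):.2f})")
```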

Keywords: Emulsion liquid membrane, MWCNT nanofluid, separation, Taguchi Method.

1259 A Construction Management Tool: Determining a Project Schedule Typical Behaviors Using Cluster Analysis

Authors: Natalia Rudeli, Elisabeth Viles, Adrian Santilli

Abstract:

Delays in the construction industry are a global phenomenon. Many construction projects experience extensive delays, exceeding the initially estimated completion time. The main purpose of this study is to identify typical behaviors of construction projects in order to develop a prognosis and management tool. Knowing a construction project's schedule tendency will enable evidence-based decision-making so that corrective actions can be taken before delays occur. This study presents an innovative approach that uses the cluster analysis method to support predictions during Earned Value Analyses. A clustering analysis was used to predict the future behavior of schedules and of the principal Earned Value Management (EVM) and Earned Schedule (ES) indexes in construction projects. The analysis was made using a database of 90 different construction projects. It was validated with additional data extracted from the literature and with another 15 contrasting projects. For all projects, planned and executed schedules were collected and the principal EVM and ES indexes were calculated. A complete linkage classification method was used; in this way, the cluster analysis considers that the distance (or similarity) between two clusters must be measured by their most disparate elements, i.e. the distance is given by the maximum span among their components. Finally, through the use of the EVM and ES indexes and Tukey and Fisher pairwise comparisons, the statistical dissimilarity was verified and four clusters were obtained. It can be said that construction projects show an average delay of 35% of their planned completion time. Furthermore, four typical behaviors were found, and for each of the obtained clusters the interim milestones and the necessary construction rhythms were identified. In general, the detected typical behaviors are: (1) projects that perform 5% of the work in the first two tenths and maintain a constant rhythm until completion (greater than 10% for each remaining tenth), being able to finish in the initially estimated time; (2) projects that start with an adequate construction rate but suffer minor delays, culminating in a total delay of almost 27% of the planned time; (3) projects which start with a performance below the planned rate and end up with an average delay of 64%; and (4) projects that begin with a poor performance, suffer great delays and end up with an average delay of 120% of the planned completion time. The obtained clusters compose a tool to identify the behavior of new construction projects by comparing their current work performance with the validated database, thus allowing the correction of initial estimations towards more accurate completion schedules.
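A minimal sketch of the clustering step, assuming progress curves sampled at each tenth of the planned duration; the three synthetic groups of curves below only illustrate complete-linkage clustering as described and are not the study's database.

```python
# Hedged sketch: complete-linkage hierarchical clustering of progress curves
# (fraction of work completed at each tenth of the planned duration).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
tenths = np.linspace(0.1, 1.0, 10)
profiles = np.vstack([
    tenths**0.9 + rng.normal(0, 0.02, (25, 10)),   # roughly on-time projects
    tenths**1.5 + rng.normal(0, 0.02, (25, 10)),   # slow start, moderate delay
    tenths**2.5 + rng.normal(0, 0.02, (25, 10)),   # poor start, large delay
])

# Complete linkage: the inter-cluster distance is that of the most disparate
# members (maximum pairwise distance), as used in the study.
Z = linkage(profiles, method="complete", metric="euclidean")
labels = fcluster(Z, t=4, criterion="maxclust")    # cut the dendrogram at 4 clusters

for k in sorted(set(labels)):
    mean_curve = profiles[labels == k].mean(axis=0)
    print(f"cluster {k}: size {np.sum(labels == k)}, "
          f"work done at mid-duration = {mean_curve[4]:.2f}")
```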

Keywords: Cluster analysis, construction management, earned value, schedule.

1258 Effects of the Coagulation Bath and Reduction Process on SO2 Adsorption Capacity of Graphene Oxide Fiber

Authors: Özge Alptoğa, Nuray Uçar, Nilgün Karatepe Yavuz, Ayşen Önen

Abstract:

Sulfur dioxide (SO2) is a very toxic air pollutant gas; it contributes to the greenhouse effect, photochemical smog, and acid rain, which threaten human health severely. Thus, the capture of SO2 gas is very important for the environment. Graphene, a two-dimensional material, has excellent mechanical, chemical and thermal properties, and many application areas such as energy storage devices, gas adsorption, sensing devices, and optical electronics. Furthermore, graphene oxide (GO) is regarded as a good adsorbent because of important features such as the functional groups (epoxy, carboxyl and hydroxyl) on its surface and its layered structure. The SO2 adsorption properties of fibers are usually investigated on carbon fibers. In this study, the potential adsorption capacity of GO fibers was researched. A GO dispersion was first obtained from graphite by Hummers' method, and GO fibers were then obtained via a wet spinning process. These fibers were formed into a disc shape, dried, and then subjected to an SO2 gas adsorption test. The SO2 gas adsorption capacity of the GO fiber discs was investigated with respect to the use of different coagulation baths and reduction by hydrazine hydrate. As coagulation baths, single and triple baths were used. The single bath contained only ethanol and CaCl2 (calcium chloride) salt. In the triple bath, each bath had a different concentration of water/ethanol and CaCl2 salt, and the disc obtained from the triple bath is referred to as the reference disc. The fibers produced with the single bath were flexible and rough, and the analyses show that they had a higher SO2 adsorption capacity than the triple-bath fibers (reference disc). However, the reduction process did not increase the adsorption capacity, because the SEM images showed that the layers and the uniform structure of the fiber were damaged, and reduction decreased the functional groups to which SO2 attaches. Scanning Electron Microscopy (SEM), Fourier Transform Infrared Spectroscopy (FTIR) and X-Ray Diffraction (XRD) analyses were performed on the fibers and discs, and their effects on the results were interpreted. In future work, parameters such as pH and additives will be examined.

Keywords: Coagulation bath, graphene oxide fiber, reduction, SO2 gas adsorption.

1257 Surface Topography Assessment Techniques based on an In-process Monitoring Approach of Tool Wear and Cutting Force Signature

Authors: A. M. Alaskari, S. E. Oraby

Abstract:

The quality of a machined surface is becoming more and more important to justify the increasing demands for sophisticated component performance, longevity, and reliability. Usually, any machining operation leaves its own characteristic evidence on the machined surface in the form of finely spaced micro-irregularities (surface roughness) left by the indeterministic characteristics of the different elements of the system: tool, machine, workpart and cutting parameters. However, one of the most influential sources affecting surface roughness in machining is the instantaneous state of the tool edge. The main objective of the current work is to relate the in-process immeasurable cutting edge deformation and surface roughness to more reliable, easy-to-measure force signals using robust non-linear time-dependent regression modeling techniques. Time-dependent modeling is beneficial when modern machining systems, such as adaptive control techniques, are considered, where the state of the machined surface and the health of the cutting edge are monitored, assessed and controlled online using real-time information provided by the variability encountered in the measured force signals. A correlation between wear propagation and roughness variation is developed throughout the different edge lifetimes. The surface roughness is further evaluated in the light of the variation in both the static and the dynamic force signals. Consistent correlation is found between surface roughness variation and tool wear progress within its initial and constant regions. During the first few seconds of cutting, the expected and well-known trend of the effect of the cutting parameters is observed: surface roughness is positively influenced by the level of the feed rate and negatively by the cutting speed. As cutting continues, roughness is affected, to different extents, by the rather localized wear modes, either on the tool nose or on its flank areas. Moreover, it seems that roughness varies as the wear attitude transfers from one mode to another and, in general, it is shown that roughness improves as wear increases, but with possible corresponding workpart dimensional inaccuracy. The dynamic force signals are found to be reasonably sensitive to either the progressive or the random modes of tool edge deformation. While the frictional force components, feeding and radial, are found to be informative regarding progressive wear modes, the vertical (power) component is found to be a more representative carrier of system instability resulting from the edge's random deformation.
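To illustrate the kind of regression referred to above, the following sketch fits an assumed non-linear, time-dependent model relating a force signal to roughness; the model form and data are placeholders, not the paper's model.

```python
# Illustrative sketch only: fit Ra = a*(F/F0)^b*exp(c*t), a power law in the
# measured force with a time-dependent drift, to synthetic data.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)
t = np.linspace(1, 300, 120)                      # cutting time, s
F = 250 + 0.6 * t + rng.normal(0, 5, t.size)      # measured feed-force signal, N
Ra = 1.2 * (F / 250) ** 1.8 * np.exp(0.001 * t) + rng.normal(0, 0.05, t.size)  # roughness, um

def model(X, a, b, c):
    """Non-linear, time-dependent relation between force signal and roughness."""
    t, F = X
    return a * (F / 250.0) ** b * np.exp(c * t)

(a, b, c), _ = curve_fit(model, (t, F), Ra, p0=(1.0, 1.0, 0.0))
print(f"fitted: Ra = {a:.2f} * (F/250)^{b:.2f} * exp({c:.4f} * t)")
```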

Keywords: Dynamic force signals, surface roughness (finish), tool wear and deformation, tool wear modes (nose, flank)

1256 A Sustainable Design Model by Integrated Evaluation of Closed-loop Design and Supply Chain Using a Mathematical Model

Authors: Yuan-Jye Tseng, Yi-Shiuan Chen

Abstract:

This paper presents a sustainable design model for the integrated evaluation of the design and supply chain of a product with respect to sustainability objectives. When designing a product, there can be alternative ways to assign the detailed specifications that fulfill the same design objectives. In these design alternatives, different materials and manufacturing processes, with various supply chain activities, may be required for production. Therefore, the different design cases need to be evaluated against the sustainability objectives. In this research, a closed-loop design model is developed by integrating a forward design model and a reverse design model. From the supply chain point of view, the decisions in the forward design model are connected with the forward supply chain, while the decisions in the reverse design model are connected with the reverse supply chain, considering the sustainability objectives. The purpose of this research is to develop a mathematical model for analyzing the design cases by jointly evaluating the criteria of the closed-loop design and the closed-loop supply chain. Decision variables are built to represent the design cases of the forward and reverse designs. The cost parameters of a forward design include the costs of materials and manufacturing processes; the cost parameters of a reverse design include the costs of recycling, disassembly, reuse, remanufacturing, and disposal. The mathematical model is formulated to minimize the total cost under the design constraints. In practical applications, the decisions of the mathematical model can be used for selecting a design case for the purpose of sustainable product design. An example product is demonstrated in the paper. The test result shows that the sustainable design model is useful for the integrated evaluation of the design and the supply chain to achieve the sustainability objectives.
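A toy sketch of the selection idea (invented cost figures, not the paper's formulation): enumerate design cases and choose the one with the lowest total closed-loop cost subject to a simple constraint.

```python
# Minimal sketch: total closed-loop cost = forward (material + manufacturing)
# + reverse (recycling, disassembly, reuse credit, remanufacture, disposal).
# All figures and the constraint below are illustrative assumptions.
from itertools import product

materials = {"alloy A": {"material": 40, "manufacturing": 25, "recycling": 6,
                         "disassembly": 4, "reuse": -10, "remanufacture": 8,
                         "disposal": 3, "mass": 2.1},
             "alloy B": {"material": 55, "manufacturing": 18, "recycling": 4,
                         "disassembly": 5, "reuse": -14, "remanufacture": 6,
                         "disposal": 2, "mass": 1.7}}
processes = {"casting": 12, "forging": 20}

best = None
for mat, proc in product(materials, processes):
    m = materials[mat]
    if m["mass"] > 2.0 and proc == "casting":      # illustrative design constraint
        continue
    forward = m["material"] + m["manufacturing"] + processes[proc]
    reverse = (m["recycling"] + m["disassembly"] + m["reuse"]
               + m["remanufacture"] + m["disposal"])
    total = forward + reverse
    if best is None or total < best[0]:
        best = (total, mat, proc)

print("selected design case (total cost, material, process):", best)
```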

Keywords: Closed-loop design, closed-loop supply chain, design evaluation, mathematical model, supply chain management, sustainable design model.

1255 Investigation of Combined use of MFCC and LPC Features in Speech Recognition Systems

Authors: K. R. Aida-Zade, C. Ardil, S. S. Rustamov

Abstract:

The paper formulates the automatic speech recognition problem and outlines the tasks of speech recognition and its application fields. Taking Azerbaijani speech as the case, the principles for building a speech recognition system and the problems arising in such a system are investigated. The algorithms for computing speech features, which form the main part of a speech recognition system, are analyzed. From this point of view, algorithms for determining the Mel Frequency Cepstral Coefficients (MFCC) and Linear Predictive Coding (LPC) coefficients, which express the basic speech features, are developed. The combined use of MFCC and LPC cepstra in the speech recognition system is suggested to improve its reliability. To this end, the recognition system is divided into MFCC-based and LPC-based recognition subsystems. The training and recognition processes are realized in both subsystems separately, and the recognition system accepts a decision only when both subsystems produce the same result, which decreases the error rate during recognition. The training and recognition processes are realized by artificial neural networks, which are trained by the conjugate gradient method. The paper investigates the problems caused by the number of speech features when training the neural networks of the MFCC- and LPC-based speech recognition subsystems. The variation in the results of neural networks trained from different initial points is analyzed. A methodology for the combined use of neural networks trained from different initial points is suggested to improve the reliability of the recognition system and increase the recognition quality, and the practical results obtained are presented.
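A hedged sketch of the combination rule described above: one subsystem works on MFCC features, the other on LPC coefficients, and a decision is accepted only when both agree. Feature extraction via the librosa library and generic scikit-learn classifiers are assumptions standing in for the paper's neural networks.

```python
# Hedged sketch of the agreement rule (not the paper's code): a word label is
# returned only when the MFCC-based and LPC-based subsystems give the same result.
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier  # generic stand-in for the paper's networks

def mfcc_features(y, sr):
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).mean(axis=1)

def lpc_features(y, order=12):
    return librosa.lpc(y, order=order)[1:]          # drop the leading 1.0 coefficient

def combined_decision(clf_mfcc, clf_lpc, y, sr):
    """Return a word label only if both subsystems agree; otherwise reject."""
    m = clf_mfcc.predict(mfcc_features(y, sr).reshape(1, -1))[0]
    l = clf_lpc.predict(lpc_features(y).reshape(1, -1))[0]
    return m if m == l else None

# Training would use the recorded corpus (X_* and labels are placeholders):
# clf_mfcc = MLPClassifier(hidden_layer_sizes=(64,)).fit(X_mfcc, labels)
# clf_lpc  = MLPClassifier(hidden_layer_sizes=(64,)).fit(X_lpc,  labels)
```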

Keywords: Speech recognition, cepstral analysis, Voice activation detection algorithm, Mel Frequency Cepstral Coefficients, features of speech, Cepstral Mean Subtraction, neural networks, Linear Predictive Coding.

1254 Digital Transformation of Lean Production: Systematic Approach for the Determination of Digitally Pervasive Value Chains

Authors: Peter Burggräf, Matthias Dannapfel, Hanno Voet, Patrick-Benjamin Bök, Jérôme Uelpenich, Julian Hoppe

Abstract:

The increasing digitalization of value chains can help companies to handle rising complexity in their processes and thereby reduce the steadily increasing planning and control effort in order to raise performance limits. Due to technological advances, companies face the challenge of smart value chains for the purpose of improving productivity, handling increasing time and cost pressure and meeting the need for individualized production. Therefore, companies need to ensure quick and flexible decisions to create self-optimizing processes and, consequently, to make their production more efficient. Lean production, as the most commonly used paradigm for complexity reduction, reaches its limits when it comes to variant-flexible production and constantly changing market and environmental conditions. To lift the performance limits that are inbuilt in current value chains, new methods and tools must be applied. Digitalization provides the potential to derive these new methods and tools. However, companies lack the experience to harmonize different digital technologies, and there is no practicable framework that guides the transformation of current value chains into digitally pervasive value chains. Current research shows that a connection between lean production and digitalization exists; this link is based on factors such as people, technology and organization. The method introduced in this paper for the determination of digitally pervasive value chains takes the factors people, technology and organization into account and extends existing approaches by a new dimension. It is the first systematic approach for the digital transformation of lean production and consists of four steps: The first step, ‘target definition’, describes the target situation and defines the depth of the analysis with regard to the inspection area and the level of detail. The second step, ‘analysis of the value chain’, verifies the lean-ability of processes and places a special focus on the integration capacity of digital technologies in order to raise the limits of lean production. Furthermore, the ‘digital evaluation process’ ensures the usefulness of digital adaptations regarding their practicability and their integrability into the existing production system. Finally, the method defines actions to be performed based on the evaluation process and in accordance with the target situation. The validation and optimization of the proposed method in a German company from the electronics industry shows that the digital transformation of current value chains based on lean production raises their inbuilt performance limits.

Keywords: Digitalization, digital transformation, lean production, Industrie 4.0, value chain.

1253 Influence of Deficient Materials on the Reliability of Reinforced Concrete Members

Authors: Sami W. Tabsh

Abstract:

The strength of reinforced concrete depends on the member dimensions and material properties. The properties of concrete and steel are not constant but random variables. The variability of concrete strength is due to batching errors, variations in mixing, cement quality uncertainties, differences in the degree of compaction and disparity in curing. Similarly, the variability of steel strength is attributed to the manufacturing process, rolling conditions, characteristics of the base material, uncertainties in chemical composition, and the microstructure-property relationships. To account for such uncertainties, codes of practice for reinforced concrete design impose resistance factors to ensure structural reliability over the useful life of the structure. In this investigation, the effects of reductions in concrete and reinforcing steel strengths from the nominal values, beyond those accounted for in the structural design codes, on the structural reliability are assessed. The considered limit states are flexure, shear and axial compression based on the ACI 318-11 structural concrete building code. Structural safety is measured in terms of a reliability index. Probabilistic resistance and load models are compiled from the available literature. The study showed that there is a wide variation in the reliability index for reinforced concrete members designed for flexure, shear or axial compression, especially when the live-to-dead load ratio is low. Furthermore, variations in concrete strength have a minor effect on the reliability of beams in flexure, a moderate effect on the reliability of beams in shear, and a severe effect on the reliability of columns in axial compression. On the other hand, changes in steel yield strength have a great effect on the reliability of beams in flexure, a moderate effect on the reliability of beams in shear, and a mild effect on the reliability of columns in axial compression. Based on these outcomes, it can be concluded that the reliability of beams is sensitive to changes in the yield strength of the steel reinforcement, whereas the reliability of columns is sensitive to variations in the concrete strength. Since the target reliability embedded in structural design codes results in lower structural safety in beams than in columns, large reductions in material strengths compromise the structural safety of beams much more than they affect columns.
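The reliability index computation can be illustrated with a small Monte Carlo sketch (not the paper's calculation); the section data, material distributions and load statistics below are assumed for illustration only.

```python
# Illustrative Monte Carlo estimate of a reliability index: limit state
# g = R - (D + L) for a singly reinforced rectangular section in flexure,
# with concrete and steel strengths treated as random variables.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
n = 200_000

fy = rng.normal(460, 0.05 * 460, n)          # steel yield strength, MPa (assumed)
fc = rng.normal(30, 0.15 * 30, n)            # concrete compressive strength, MPa (assumed)
As, b, d = 1500e-6, 0.30, 0.45               # steel area (m^2), width (m), effective depth (m)

a = As * fy * 1e3 / (0.85 * fc * 1e3 * b)    # stress-block depth, m
R = As * fy * 1e3 * (d - a / 2)              # flexural resistance, kN*m

D = rng.normal(130, 0.10 * 130, n)           # dead-load effect, kN*m (assumed)
L = rng.lognormal(np.log(65), 0.25, n)       # live-load effect, kN*m (assumed)

pf = np.mean(R < D + L)                      # probability of failure
beta = -norm.ppf(pf)                         # reliability index
print(f"Pf ~ {pf:.1e}, beta ~ {beta:.2f}")
```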

Keywords: Code, flexure, limit states, random variables, reinforced concrete, reliability, reliability index, shear, structural safety.

1252 Life Cycle Datasets for the Ornamental Stone Sector

Authors: Isabella Bianco, Gian Andrea Blengini

Abstract:

The environmental impact of ornamental stones (such as marbles and granites) is largely debated. Since the industrial revolution, continuous improvements in machinery have led to greater exploitation of this natural resource and to more international interaction between markets. As a consequence, the environmental impact of the extraction and processing of stones has increased. Nevertheless, compared with other building materials, ornamental stones are generally more durable, natural, and recyclable. From the scientific point of view, studies on the life cycle sustainability of stone have been carried out, but these are often partial or not very significant because of the high share of approximations and assumptions in the calculations. This is due to the lack, in life cycle databases (e.g. Ecoinvent, Thinkstep, and ELCD), of datasets on the specific technologies employed in the stone production chain. For example, the databases do not contain information about diamond wires, chains or explosives, materials commonly used in quarries and transformation plants. The project presented in this paper aims to populate the life cycle databases with data on specific stone processes. To this end, the methodology follows the standardized Life Cycle Assessment (LCA) approach, according to the requirements of UNI 14040-14044 and to the International Reference Life Cycle Data System (ILCD) Handbook guidelines of the European Commission. The study analyses the processes of the entire production chain (from-cradle-to-gate system boundaries), including the extraction of benches, the cutting of blocks into slabs/tiles and the surface finishing. Primary data have been collected in Italian quarries and transformation plants that use technologies representative of the current state of the art. Since the technologies vary according to the hardness of the stone, the case studies comprise both soft stones (marbles) and hard stones (gneiss). In particular, data on energy, materials and emissions were collected in the marble basins of Carrara and in the Beola and Serizzo basins located in the province of Verbano Cusio Ossola. The data were then elaborated with appropriate software to build a life cycle model. The model was realized with free parameters that allow easy adaptation to specific productions. Through this model, the study aims to boost the direct participation of stone companies and encourage the use of the LCA tool to assess and improve the environmental sustainability of the stone sector. At the same time, the production of accurate Life Cycle Inventory data aims to make ILCD-compliant datasets of the most significant processes and technologies of the ornamental stone sector available to researchers and stone experts.

Keywords: LCA datasets, life cycle assessment, ornamental stone, stone environmental impact.

1251 Computer Aided Design Solution Based on Genetic Algorithms for FMEA and Control Plan in Automotive Industry

Authors: Nadia Belu, Laurentiu M. Ionescu, Agnieszka Misztal

Abstract:

In this paper we propose a computer-aided solution based on genetic algorithms in order to reduce the effort of drafting the reports required in the manufacture of a product launch, namely the FMEA analysis and the Control Plan, and to improve the knowledge base of development teams for future projects. The solution allows the design team to enter the data required for the FMEA. The actual analysis is performed using genetic algorithms to find an optimum between the RPN risk factor and the cost of production; a feature of genetic algorithms is that they can be used to find solutions for multi-criteria optimization problems. In our case, the three specific FMEA risk factors are considered together with the reduction of production cost. The analysis tool generates final reports for all FMEA processes, and the data obtained in the FMEA reports are automatically integrated with the other entered parameters in the Control Plan. The solution is implemented as an application running on an intranet on two servers: one containing the analysis and plan generation engine and the other containing the database where the initial parameters and results are stored. The results can then be used as starting solutions in the synthesis of other projects. The solution was applied to the welding, laser cutting and bending processes used to manufacture chassis for buses. The advantages of the solution are the efficient elaboration of documents in the current project, by automatically generating the FMEA and Control Plan reports using multi-criteria optimization of production, and the building of a solid knowledge base for future projects. The proposed solution is a cheap alternative to other solutions on the market, as it is implemented using open source tools.
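A toy genetic-algorithm sketch of the optimization idea (invented severity/occurrence/detection values and costs, not the authors' tool): each individual selects one mitigation option per failure mode, and the fitness trades off total RPN against added production cost.

```python
# Toy sketch: a small GA choosing one mitigation action per failure mode so as
# to trade off total RPN (severity x occurrence x detection) against extra cost.
import random
random.seed(0)

# For each failure mode: list of (severity, occurrence, detection, extra_cost) options.
options = [
    [(8, 6, 7, 0.0), (8, 4, 5, 1.2), (8, 2, 3, 3.0)],   # weld porosity (invented)
    [(6, 5, 6, 0.0), (6, 3, 4, 0.8), (6, 2, 2, 2.5)],   # laser-cut burr (invented)
    [(7, 4, 6, 0.0), (7, 3, 3, 1.0), (7, 2, 2, 2.2)],   # bending springback (invented)
]
W_RPN, W_COST = 1.0, 60.0                                # weighting of the two criteria

def fitness(ind):
    rpn = sum(options[i][g][0] * options[i][g][1] * options[i][g][2]
              for i, g in enumerate(ind))
    cost = sum(options[i][g][3] for i, g in enumerate(ind))
    return W_RPN * rpn + W_COST * cost                   # lower is better

pop = [[random.randrange(len(o)) for o in options] for _ in range(20)]
for _ in range(50):                                      # generations
    pop.sort(key=fitness)
    parents, children = pop[:10], []
    while len(children) < 10:
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, len(options))
        child = a[:cut] + b[cut:]                        # one-point crossover
        if random.random() < 0.2:                        # mutation
            i = random.randrange(len(options))
            child[i] = random.randrange(len(options[i]))
        children.append(child)
    pop = parents + children

best = min(pop, key=fitness)
print("chosen option per failure mode:", best, "fitness:", round(fitness(best), 1))
```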

Keywords: Automotive industry, control plan, FMEA.

1250 FEM and Experimental Modal Analysis of Computer Mount

Authors: Vishwajit M. Ghatge, David Looper

Abstract:

Over the last few decades, oilfield service rolling equipment has significantly increased in weight, primarily because of emissions regulations, which require larger/heavier engines, larger cooling systems, and, in some cases, emissions after-treatment systems. Larger engines cause more vibration and shock loads, leading to failure of electronics and control systems. If the vibrating frequency of the engine matches the system frequency, high resonance is observed in structural parts and mounts. One such existing automated control equipment system, comprising wire rope mounts used for mounting computers, was designed approximately 12 years ago. It includes an industrial-grade computer to control the system operation. The original computer had a smaller, lighter enclosure. After a few years, a newer computer version was introduced, which was 10 lbm heavier. Some failures of internal computer parts have been documented for cases in which the old mounts were used. Because of the added weight, there is a possibility of the two brackets impacting each other under off-road conditions, which causes a high shock input to the computer parts. This added failure mode requires validating the existing mount design for the new, heavier computer. This paper discusses the modal finite element method (FEM) analysis and experimental modal analysis conducted to study the effects of vibration on the wire rope mounts and the computer. The existing mount was modelled in ANSYS software, and the resulting mode shapes and frequencies were obtained. The experimental modal analysis was conducted, and the actual frequency responses were observed and recorded. The results clearly revealed that at the resonance frequency the brackets were colliding and potentially causing damage to computer parts. To solve this issue, spring mounts of different stiffness were modelled in ANSYS software, and the resonant frequency was determined. Increasing the stiffness of the system moved the resonant frequency away from the frequency window in which the engine showed heavy vibration or resonance. After multiple iterations in ANSYS software, the stiffness of the spring mount was finalized and then experimentally validated.
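A back-of-the-envelope sketch of the stiffness-selection reasoning: for a rigid mass on mounts of combined stiffness k, the natural frequency is f = sqrt(k/m) / (2*pi), so the mount stiffness can be chosen to move resonance out of the excitation band. The mass, candidate stiffnesses and excitation band below are invented values, not the system's data.

```python
# Sketch: scan candidate spring-mount stiffnesses and flag those whose natural
# frequency falls inside an assumed engine-dominated excitation band.
import math

mass_kg = 18.0                               # computer plus bracket (assumed)
excitation_band_hz = (8.0, 16.0)             # assumed engine vibration band

for k_per_mount in (2e4, 4e4, 8e4, 1.6e5):   # N/m, candidate spring mounts
    k_total = 4 * k_per_mount                # four mounts acting in parallel
    f_n = math.sqrt(k_total / mass_kg) / (2 * math.pi)
    inside = excitation_band_hz[0] <= f_n <= excitation_band_hz[1]
    flag = "  <- inside excitation band, avoid" if inside else ""
    print(f"k = {k_per_mount:8.0f} N/m per mount -> f_n = {f_n:5.1f} Hz{flag}")
```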

Keywords: Experimental Modal Analysis, FEM Modal Analysis, Frequency, Modal Analysis, Resonance, Vibration.

1249 Solving Differential's Equation of Carrier Load on Semiconductor

Authors: Morteza Amirabadi, Vahid Fayaz, Fereshteh Felegary, Hossien Hossienkhani

Abstract:

The cadmium zinc telluride (CdZnTe) semiconductor detector is particularly suitable because of its unique properties, namely its high atomic number and wide band gap. In this project, the carrier transport in CdZnTe is treated by considering processes such as drift, diffusion, generation and recombination, together with the effects of trapping and carrier injection, in order to obtain a complete solution of the governing differential equation for the holes. Using this solution, the movement of the carriers (electrons and holes) is then investigated.

Keywords: Semiconductor detector, trapping, recombination, diffusion.

1248 Large-Scale Production of High-Performance Fiber-Metal-Laminates by Prepreg-Press-Technology

Authors: Christian Lauter, Corin Reuter, Shuang Wu, Thomas Troester

Abstract:

Lightweight construction has become more and more important over the last decades in several applications, e.g. in the automotive and aircraft sectors. This is the result of economic and ecological constraints on the one hand and increasing safety and comfort requirements on the other. In the field of lightweight design, different approaches are used, according to the specific requirements of the technical systems. The use of endless carbon fiber reinforced plastics (CFRP) offers the largest weight saving potential, sometimes more than 50% compared to conventional metal constructions. However, industrial applications remain very limited because of the cost-intensive manufacturing of the fibers and the production technologies. Other disadvantages of pure CFRP structures concern quality control and damage resistance. One approach to meet these challenges is hybrid materials, in which CFRP and sheet metal are combined at the material level, opening up new opportunities for innovative process routes. Hybrid lightweight design results in lower costs due to an optimized material utilization and the possibility of integrating the structures into already existing production processes of automobile manufacturers. Recent and current research has pointed out the advantages of two-layered hybrid materials, i.e. the possibility of realizing structures with tailored mechanical properties or of dividing the curing cycle of the epoxy resin into two steps. Current research work at the Chair for Automotive Lightweight Design (LiA) at Paderborn University focusses on production processes for fiber-metal-laminates. The aim of this work is the development and qualification of a large-scale production process for high-performance fiber-metal-laminates (FML) for industrial applications in the automotive or aircraft sector. To this end, the prepreg-press technology is used, in which pre-impregnated carbon fibers and sheet metals are formed and cured in a closed, heated mold. The investigations focus, for example, on the realization of short process chains and cycle times, on the reduction of time-consuming manual process steps, and on the reduction of material costs. This paper first gives an overview of the main steps of the production process. Afterwards, experimental results are discussed, concentrating on the influence of different process parameters on the mechanical properties and the laminate quality, and on the identification of process limits. Finally, the advantages of this technology compared to conventional FML production processes and other lightweight design approaches are outlined.

Keywords: Composite material, Fiber metal laminate, Lightweight construction, Prepreg press technology, Large-series production.

1247 Communication and Human Resource Management and its Compliance with Culture

Authors: D. Charvatova, C.G. van der Veer

Abstract:

According to the conception of personnel management, human resource management requires the efficient use of human resources. This is ensured by various activities directed towards the area of management, for example the recruitment of employees, their development, the strengthening of relations, mutual inspiration, and the implementation of correct working processes and systems used by individuals or groups.

Keywords: Communication, company, customers, employees, human resource management, manager, organizational structure, personnel management, strategic management.

1246 Knowledge Management Strategies within a Corporate Environment of Papers

Authors: Daniel J. Glauber

Abstract:

Knowledge transfer between personnel could benefit an organization's competitive advantage in the marketplace through a strategic approach to knowledge management. The lack of information sharing between personnel could create knowledge transfer gaps while restricting decision-making processes. Knowledge transfer between personnel can potentially improve information sharing based on an implemented knowledge management strategy. An organization's capacity to gain more knowledge is aligned with the organization's prior or existing captured knowledge. This case study attempted to understand the overall influence of a knowledge management system (KMS) within the corporate environment and the knowledge exchange between personnel. The significance of this study was to help understand how organizations can improve the return on investment (ROI) of a knowledge management strategy within a knowledge-centric organization. A qualitative descriptive case study was the research design selected for this study. Developing a knowledge management strategy acceptable at all levels of the organization requires cooperation in support of a common organizational goal, working with management and executive members to develop a protocol in which knowledge transfer becomes a standard practice in multiple tiers of the organization. The knowledge transfer process can be made measurable by focusing on specific elements of the organizational process, including personnel transition, to help reduce the time required to understand the job. The organization studied in this research acknowledged the need for improved knowledge management activities within the organization to help organize, retain, and distribute information throughout the workforce. Data produced from the study indicate three main themes identified by the participants: information management, organizational culture, and knowledge sharing within the workforce. These themes indicate a possible connection between an organization's KMS, the organization's culture, knowledge sharing, and knowledge transfer.

Keywords: Knowledge management strategies, knowledge transfer, knowledge management, knowledge capacity.

1245 Modeling Aggregation of Insoluble Phase in Reactors

Authors: A. Brener, B. Ismailov, G. Berdalieva

Abstract:

In this paper we present a modification of the kinetic Smoluchowski equation for binary aggregation, applied to systems with chemical reactions of first and second order in which the main product is insoluble. The goal of this work is to create the theoretical foundation and engineering procedures for designing chemical apparatuses under the joint action of chemical reactions and the aggregation of the insoluble dispersed phases that are formed in the working zones of the reactor.
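One plausible way to write such a modification (the authors' exact kernel and source term may differ) is a discrete Smoluchowski balance with a reaction source feeding insoluble monomers, where $n_k$ is the number density of aggregates of $k$ primary particles, $K_{i,j}$ the aggregation kernel, $c$ the reagent concentration, $k_1$ and $k_2$ the first- and second-order rate constants, and $\delta_{k,1}$ the Kronecker delta:

\[
\frac{\mathrm{d}n_k}{\mathrm{d}t}
= \frac{1}{2}\sum_{i+j=k} K_{i,j}\, n_i n_j
\;-\; n_k \sum_{j \ge 1} K_{k,j}\, n_j
\;+\; \delta_{k,1}\bigl(k_1 c + k_2 c^{2}\bigr),
\qquad k = 1, 2, \dots
\]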

Keywords: Binary aggregation, Clusters, Chemical reactions, Insoluble phases.

1244 The Ombudsman: Different Terminologies Same Missions

Authors: Khodr Fakih

Abstract:

The Ombudsman is a procedural mechanism that provides a different approach to dispute resolution. The ombudsman primarily deals with specific grievances from the public against governmental injustice and misconduct. The ombudsman theory is considered an important instrument in any democratic government, since it improves the transparency of governmental activities in a world in which executive powers are rising. Many countries have adopted the concept of the Ombudsman, but under different terminologies. This paper presents the different types of Ombudsman and the common activities and processes through which they fulfill their mandates.

Keywords: Administration, Citizens, Government, Mediator, Ombudsman, Presidential Mediator.

1243 Information Security Risk Management in IT-Based Process Virtualization: A Methodological Design Based on Action Research

Authors: Jefferson Camacho Mejía, Jenny Paola Forero Pachón, Luis Carlos Gómez Flórez

Abstract:

Action research is a qualitative research methodology that leads the researcher to delve into the problems of a community in order to understand its needs in depth and, finally, to propose actions that lead to a change of social paradigm. Although this methodology had its beginnings in the human sciences, it has attracted increasing interest and acceptance in the field of information systems research since the 1990s. The countless possibilities offered nowadays by the use of Information Technologies (IT) in the development of different socio-economic activities have meant a change of social paradigm and the emergence of the so-called information and knowledge society. Accordingly, governments, large corporations, small entrepreneurs and, in general, organizations of all kinds are using IT to virtualize their processes, taking them from the physical environment to the digital environment. However, there is a potential risk for organizations of exposing valuable information without an appropriate framework for protecting it. This paper shows progress in the development of a methodological design to manage the information security risks associated with IT-based process virtualization, by applying the principles of the action research methodology; it is the result of a systematic review of the scientific literature. The design consists of seven fundamental stages, distributed across the three stages described in the action research methodology: 1) observe, 2) analyze and 3) take action. Finally, this paper aims to offer an alternative to traditional information security management methodologies, intended to be applied specifically in the planning stage of IT-based process virtualization in order to foresee risks and to establish security controls before formulating IT solutions in any type of organization.

Keywords: Action research, information security, information technology, methodological design, process virtualization, risk management.

1242 Wastewater Treatment in Moving-Bed Biofilm Reactor operated by Flow Reversal Intermittent Aeration System

Authors: B. K. Kim, D. Chang, D. J. Son, D. W. Kim, J. K. Choi, H. J. Yeon, C. Y. Yoon, Y. Fan, S. Y. Lim, K. H. Hong

Abstract:

The intermittent aeration process can easily be applied to existing activated sludge systems and is highly reliable against loading changes; it can also be operated in a relatively simple way. Since the moving-bed biofilm reactor method processes pollutants by attaching and retaining the microorganisms on media, the process efficiency can be higher than in suspended-growth biological treatment processes, and the return of sludge can be reduced. In this study, the existing intermittent aeration process with alternating flow, as applied to the oxidation ditch, is applied to a continuous-flow stirred tank reactor, combining the advantages of both processes, and we aim to develop the process so as to significantly reduce the return of sludge from the clarifier and to secure a reliable quality of treated water by adding the moving media. The corresponding process is a suitable infrastructure for the u-environment of the future u-City and is expected to accelerate the implementation of the u-Eco city in conjunction with city-based services. The laboratory-scale system was operated at an HRT of 8 hours, excluding the final clarifier, and showed removal efficiencies of 97.7%, 73.1% and 9.4% for organic matter, TN and TP, respectively, with a 4-hour operating cycle at a system SRT of 10 days. After the media were added, the phosphorus removal efficiency remained at a level similar to that before the addition, but the nitrogen removal efficiency improved by 7-10%. In addition, the solids, which had been maintained at an MLSS of 1200-1400, all attached onto the media at 25% media packing, so that no sludge entered the clarifier; therefore, the return of sludge is no longer needed.

Keywords: Municipal wastewater treatment, Biological nutrient removal, Alternating flow intermittent aeration system, Reversal flow intermittent aeration system, Moving-bed biofilm reactor, CFSTR, u-City, u-Eco city

1241 Impact of Quality Assurance Mechanisms on the Work Efficiency of Staff in the Educational Space of Georgia

Authors: B. Gechbaia, K. Goletiani, G. Gabedava, N. Mikeltadze

Abstract:

At this stage, Georgia is a country actively involved in the European integration process, for which the primary priority is effective integration into the European education system. The modern Georgian higher education system is in the process of establishing a new sociocultural reality, whose main priorities are determined by the quality system as a continuous cycle of planning, implementation, checking and acting. Obviously, in this situation, the issue of the management of educational institutions comes to the foreground, since the proper planning and implementation of personnel management processes is one of the main determinants of an institution's performance. At the same time, one of the most important factors is the psychological comfort of the personnel, ensuring their protection and the efficiency of the stress management policy.

The purpose of this research is to determine how strong the relationship is between the psychological comfort of the personnel and the efficiency of the quality system in the institution, i.e. how the quality assurance mechanisms of educational institutions affect the stability of personnel and the prevention and management of stressful situations. The research was carried out within the framework of the internal grant project «The Role of Organizational Culture in the Process of Settlement of Management of Stress and Conflict, Georgian Reality and European Experience» of the Batumi Navigation Teaching University, based on the analysis of survey results of target groups. The small-scale research we conducted has revealed that the introduction of a quality assurance system and its active implementation increased the quality of management of Georgian educational institutions, increased the level of universal engagement in internal and external processes and, as a result, improved the quality of education as well as the social and psychological comfort indicators of the society.

Keywords: Quality assurance, effective management, stability of personnel, psychological comfort, stress management.

1240 Mathematical Modeling for the Processes of Strain Hardening in Heterophase Materials with Nanoparticles

Authors: Mikhail Semenov, Svetlana Kolupaeva, Tatiana Kovalevskaya, Olga Daneyko

Abstract:

An investigation of the process of deformation hardening and of the evolution of the deformation defect medium in dispersion-hardened materials with face-centered cubic matrices and nanoparticles was carried out. A mathematical model including balance equations for the deformation defects was used.

Keywords: deformation defects, dispersion-hardened materials, mathematical modeling, plastic deformation

1239 On Bounding Jayanti's Distributed Mutual Exclusion Algorithm

Authors: Awadhesh Kumar Singh

Abstract:

Jayanti's algorithm is one of the best known abortable mutual exclusion algorithms. This work is an attempt to overcome an already known limitation of the algorithm while preserving all its important properties and elegance. The limitation is that the token number used to assign a process identification number to each new incoming process is unbounded. We have used a suitably adapted alternative data structure in order to completely eliminate the use of the token number in the algorithm.

Keywords: Abortable, deterministic, local spin, mutual exclusion.

1238 Calibration of 2D and 3D Optical Measuring Instruments in Industrial Environments at Submillimeter Range

Authors: A. Mínguez-Martínez, J. de Vicente

Abstract:

Modern manufacturing processes have led to the miniaturization of systems and, as a result, parts at the micro- and nanoscale are produced. This trend seems set to become increasingly important in the near future. Besides, as a requirement of Industry 4.0, the digitalization of the models of production and processes makes it very important to ensure that the dimensions of newly manufactured parts meet the specifications of the models. In this way it is possible to reduce scrap and the cost of non-conformities while ensuring the stability of production. To ensure the quality of manufactured parts, it becomes necessary to carry out traceable measurements at scales below one millimeter. Providing adequate traceability to the SI unit of length (the meter) for 2D and 3D measurements at this scale is a problem that does not have a unique solution in industrial environments, and researchers in the field of dimensional metrology all around the world are working on this issue. A solution for industrial environments, even if incomplete, will enable working with some traceability. At this point, we believe that the study of surfaces could provide a first approximation to a solution. In this paper, we propose a calibration procedure for the scales of optical measuring instruments, particularized for a confocal microscope, using material standards that are easy to find and calibrate in metrology and quality laboratories in industrial environments. Confocal microscopes are measuring instruments capable of filtering out the out-of-focus reflected light so that, when the light reaches the detector, it is possible to take pictures of the part of the surface that is in focus. By taking pictures at different Z levels of focus, specialized software interpolates between the different planes and reconstructs the surface geometry as a 3D model. As is easy to deduce, it is necessary to give traceability to each axis. As a complementary result, the roughness parameter Ra will be traced to the reference. Although the solution is designed for a confocal microscope, it may be used for the calibration of other optical measuring instruments with minor changes.
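One step of such a calibration can be sketched as follows (an illustration, not the authors' procedure): indicated lengths from the instrument are fitted against certified lengths of a material standard to obtain an axis amplification coefficient and residual errors; all values below are invented.

```python
# Hedged sketch: least-squares fit of indicated vs. certified lengths of a
# calibrated line scale to estimate the X-axis amplification (scale) coefficient
# of the instrument and its residual errors.
import numpy as np

certified = np.array([10.0, 50.0, 100.0, 200.0, 400.0, 800.0])        # um, reference standard
indicated = np.array([10.02, 50.11, 100.19, 200.43, 400.85, 801.73])  # um, instrument readings

# Linear model: indicated = a * certified + b  (a = amplification coefficient)
a, b = np.polyfit(certified, indicated, 1)
residuals = indicated - (a * certified + b)

print(f"amplification coefficient a = {a:.5f}  (correction factor = {1/a:.5f})")
print(f"zero offset b = {b:+.3f} um, max residual = {np.max(np.abs(residuals)):.3f} um")
```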

Keywords: Industrial environment, confocal microscope, optical measuring instrument, traceability.

1237 Characterization of Acetogenic and Methanogenic Leachates Generated from a Sanitary Landfill Site

Authors: Aik Heng Lee, Hamid Nikraz, Yung Tse Hung

Abstract:

The decomposition processes taking place in a landfill generate leachates that can be categorized as mainly acetogenic or methanogenic in nature. The BOD:COD ratio computed in this study for a landfill site over a three-year period proved to be a good indicator for distinguishing acetogenic leachate from methanogenic leachate. Correlation relationships that predict pollutant levels while taking climatic conditions into consideration are derived.
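The ratio-based classification can be sketched in a few lines; the threshold values used here (above about 0.5 for young acetogenic leachate, below about 0.1 for stabilised methanogenic leachate) are common rules of thumb from the leachate literature, not the cut-offs derived in this study.

```python
# Minimal sketch: classifying leachate samples by their BOD5:COD ratio.
def classify_leachate(bod5, cod):
    ratio = bod5 / cod
    if ratio > 0.5:
        return ratio, "acetogenic (young, readily biodegradable)"
    if ratio < 0.1:
        return ratio, "methanogenic (stabilised, poorly biodegradable)"
    return ratio, "intermediate / transitional"

for bod5, cod in [(6500, 12000), (180, 3000), (900, 4000)]:   # mg/L, example samples
    r, label = classify_leachate(bod5, cod)
    print(f"BOD5={bod5}, COD={cod}: ratio={r:.2f} -> {label}")
```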

Keywords: Acetogenic Leachate, Methanogenic Leachate, BOD:COD Ratio.

1236 The Global Children’s Challenge Program: Pedometer Step Count in an Australian School

Authors: D. Hilton

Abstract:

The importance and significance of this research rest on the fundamental knowledge, reported in the scientific literature, that physical activity is inversely associated with obesity. In addition, it is recognized that there is a global epidemic of sedentariness, while at the same time it is known that morbidity and mortality are associated with physical inactivity and with overweight or obesity. Hence, this small study of school students addresses an important area of research in our community. An application submitted in 2005 for the inaugural Public Health Education Research Trust [PHERT] Post Graduate Research Scholarship scheme, organized by the Public Health Association of Australia [PHAA], was awarded 3rd place within Australia. The author and title were: D. Hilton, Methods to increase physical activity in school aged children [literature review, a trial using pedometers and a policy paper]. Third place is a good result; however, it did not secure funding for the project, as only first place received $5000 funding. Some years later, a program called the Global Children's Challenge [GCC] commenced within Australia. As details of the 2005 award above were included, an application prepared for Parkhill Primary School [PPS], located in Victoria, Australia, was successful. As a result, an excited combined grade 3/4 class at the school [27 students] became recipients of free pedometers in 2012. Ambassadors for the program were Mrs Catherine Freeman [OAM], Olympic gold medalist at Sydney 2000 [400 meters], and Mr Colin Jackson [CBE], a Welsh former sprint and hurdling athlete. For PPS and the other schools involved in 2012, the program website shows that the event started on 19 September 2012: students were to wear the pedometer every day for 50 days [at home and at school], aiming for the recommended 15,000 steps/day and recording the steps taken in a booklet provided. After the finish, an analysis of the average step count for this school showed that the average number of steps taken per day was 14,003; however, only a small percentage of students returned the booklets and units, as the dates of the program unfortunately coincided with school holidays, so some students either forgot or misplaced the units/booklets. Funding for this program ceased in 2013; however, the lasting impact of the trial on students' knowledge and awareness remains and provides a good grounding for students in how to monitor basic daily physical activity using a method that is easy, fun, low cost and readily accessible.

Keywords: Walking, exercise, physical activity [motor activity].

1235 Sedimentary Response to Coastal Defense Works in São Vicente Bay, São Paulo

Authors: L. C. Ansanelli, P. Alfredini

Abstract:

This article evaluates the effectiveness of two groins located at Gonzaguinha and Milionários Beaches, on the southeast coast of Brazil. The effectiveness of these coastal defense structures is evaluated in terms of sedimentary dynamics, which is one of the most important environmental processes to be assessed in coastal engineering studies. The applied method is based on the Delft3D numerical modelling system: the Delft3D-WAVE module was used for wave modelling, Delft3D-FLOW for hydrodynamic modelling and Delft3D-SED for sediment transport modelling. The models were calibrated so that the simulations adequately represent the region studied, evaluating improvements in the model elements through statistical comparisons of similarity between the results and the wave, current and tide data recorded in the study area. An analysis of the maximum wave heights was carried out to select the months with the highest accumulated energy, and these conditions were implemented in the engineering scenarios. The engineering studies were performed for two scenarios: 1) numerical simulation of the area considering only the two existing groins; 2) conception of breakwaters coupled to the ends of the existing groins, resulting in two T-shaped structures. The sediment model showed that, for the simulated period, the area is affected by erosive processes and that the existing groins have little effectiveness in defending the coast in question. The implemented T-structures showed some effectiveness in protecting the beaches against erosion and allowed the recovery of the portion of Milionários Beach directly sheltered by them. To complement this study, the conception of further engineering scenarios that might recover other areas of the studied region is suggested.

Keywords: Coastal engineering, coastal erosion, Sao Vicente Bay, Delft3D, coastal engineering works.

1234 Characteristics of Cognitive Functions among Polish Adolescence with Spelling Disorders

Authors: Izabela Pietras

Abstract:

The levels of development of visual abilities, language, memory processes and intellectual functioning affect the quality of a written text. The following analysis presents the results of diagnostic tests indicating the most common criteria for the group and stating whether a person has been diagnosed with a cognitive developmental level below the group's average or not. The study's aim is to determine whether there are specific patterns of cognitive deficits that can be distinguished among the group of young people with spelling disorders.

Keywords: Cognitive deficits, cognitive functions, spelling disorders.

1233 Critical Assessment of Scoring Schemes for Protein-Protein Docking Predictions

Authors: Dhananjay C. Joshi, Jung-Hsin Lin

Abstract:

Protein-protein interactions (PPI) play a crucial role in many biological processes such as cell signalling, transcription, translation, replication, signal transduction, and drug targeting. Structural information about protein-protein interactions is essential for understanding the molecular mechanisms of these processes. Structures of protein-protein complexes are still difficult to obtain by biophysical methods such as NMR and X-ray crystallography, and therefore protein-protein docking computation is considered an important approach for understanding protein-protein interactions. However, reliable prediction of protein-protein complexes is still under way. In the past decades, several grid-based docking algorithms based on the Katchalski-Katzir scoring scheme were developed, e.g., FTDock, ZDOCK, HADDOCK, RosettaDock, HEX, etc. However, the success rate of protein-protein docking prediction is still far from ideal. In this work, we first propose a more practical measure for evaluating the success of protein-protein docking predictions, the rate of first success (RFS), which is similar to the concept of mean first passage time (MFPT). Accordingly, we have assessed the ZDOCK bound and unbound benchmarks 2.0 and 3.0. We also created a new benchmark set for protein-protein docking predictions, in which the complexes have experimentally determined binding affinity data. We performed free energy calculations based on the solution of the non-linear Poisson-Boltzmann equation (nlPBE) to improve the binding mode prediction. We used the well-studied barnase-barstar system to validate the parameters for the free energy calculations. In addition, the nlPBE-based free energy calculations were conducted for the cases badly predicted by ZDOCK and ZRANK. We found that direct molecular mechanics energetics cannot be used to discriminate the native binding pose from the decoys. Our results indicate that nlPBE-based calculations appear to be one of the promising approaches for improving the success rate of binding pose predictions.
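A hedged illustration of a "rate of first success"-style measure, as we read it from the abstract (not the authors' definition or code): for each target, the rank of the first near-native pose in the score-ordered decoy list is recorded, and the average first-hit rank plays a role analogous to a mean first passage time.

```python
# Hedged sketch: rank of the first near-native decoy per target; data are invented.
import numpy as np

def first_hit_rank(rmsd_sorted_by_score, cutoff=2.5):
    """1-based rank of the first decoy within `cutoff` Angstrom of the native pose."""
    hits = np.flatnonzero(np.asarray(rmsd_sorted_by_score) <= cutoff)
    return int(hits[0]) + 1 if hits.size else None

# Interface RMSDs of decoys, already ordered by docking score (illustrative only):
targets = {
    "1A2K": [8.1, 6.7, 2.1, 9.0],     # first success at rank 3
    "1BUH": [1.9, 7.5, 8.8, 6.0],     # first success at rank 1
    "1E6E": [9.4, 8.2, 7.7, 6.9],     # no success within the list
}
ranks = [r for r in (first_hit_rank(v) for v in targets.values()) if r is not None]
print("mean rank of first success over successful targets:", np.mean(ranks))
```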

Keywords: protein-protein docking, protein-protein interaction, molecular mechanics energetics, Poisson-Boltzmann calculations

1232 Internal Accounting Controls

Authors: Alireza Azimi Sani, Shahram Chaharmahalie

Abstract:

Internal controls of accounting are an essential business function for a growth-oriented organization, and include the elements of risk assessment, information communications and even employees' roles and responsibilities. Internal controls of accounting systems are designed to protect a company from fraud, abuse and inaccurate data recording and help organizations keep track of essential financial activities. Internal controls of accounting provide a streamlined solution for organizing all accounting procedures and ensuring that the accounting cycle is completed consistently and successfully. Implementing a formal Accounting Procedures Manual for the organization allows the financial department to facilitate several processes and maintain rigorous standards. Internal controls also allow organizations to keep detailed records, manage and organize important financial transactions and set a high standard for the organization's financial management structure and protocols. A well-implemented system also reduces the risk of accounting errors and abuse. A well-implemented controls system allows a company's financial managers to regulate and streamline all functions of the accounting department. Internal controls of accounting can be set up for every area to track deposits, monitor check handling, keep track of creditor accounts, and even assess budgets and financial statements on an ongoing basis. Setting up an effective accounting system to monitor accounting reports, analyze records and protect sensitive financial information also can help a company set clear goals and make accurate projections. Creating efficient accounting processes allows an organization to set specific policies and protocols on accounting procedures, and reach its financial objectives on a regular basis. Internal accounting controls can help keep track of such areas as cash-receipt recording, payroll management, appropriate recording of grants and gifts, cash disbursements by authorized personnel, and the recording of assets. These systems also can take into account any government regulations and requirements for financial reporting.

Keywords: Internal controls, risk assessment, financial management.
