Search results for: process model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27922

24442 Intensification of Heat Transfer in Magnetically Assisted Reactor

Authors: Dawid Sołoducha, Tomasz Borowski, Marian Kordas, Rafał Rakoczy

Abstract:

The magnetic field has become an important element of many studies in the past few years. A magnetic field (MF) may be used to affect a process in many ways; for example, it can be used as a factor to stabilize the system. MF can be used to steer the operation, to activate or inhibit the process, or even to affect the vital activity of microorganisms. Using various types of magnetic field generators is always connected with the delivery of some heat to the system. Heat transfer is a very important phenomenon; it can influence the process both positively and negatively, so it is necessary to measure the heat stream transferred from the place of generation and to prevent a negative influence on the operation. The aim of the presented work was to apply various types of magnetic fields and to measure the accompanying heat transfer phenomena. The results were obtained by continuous measurement at several measuring points with temperature probes and compiled in the form of temperature profiles. The study investigated the undetermined heat transfer in a custom system equipped with a magnetic field generator. Experimental investigations are provided to explain the influence of the various types of magnetic fields on the heat transfer process. The tested processes are described by means of criteria that define heat transfer intensification under the action of a magnetic field.

Keywords: heat transfer, magnetic field, undetermined heat transfer, temperature profile

Procedia PDF Downloads 184
24441 Using Building Information Modelling to Mitigate Risks Associated with Health and Safety in the Construction and Maintenance of Infrastructure Assets

Authors: Mohammed Muzafar, Darshan Ruikar

Abstract:

BIM, an acronym for Building Information Modelling, refers to the practice of creating a computer-generated model capable of displaying the planning, design, construction and operation of a structure. The resulting simulation is a data-rich, object-oriented, intelligent and parametric digital representation of the facility, from which views and data appropriate to various users' needs can be extracted and analysed to generate information that can be used to make decisions and to improve the process of delivering the facility. BIM also refers to a shift in culture that will influence the way the built environment and infrastructure operate and how they are delivered. One of the main issues of concern in the UK construction industry at present is its record on Health & Safety (H&S). It is, therefore, important that new technologies such as BIM are developed to help improve the quality of health and safety. Historically, the H&S record of the construction industry in the UK has been relatively poor compared with the manufacturing industries. BIM and the digital environment it operates within now allow design and construction data to be used in a more intelligent way: data generated by the design process can be re-purposed to improve efficiencies in other areas of a project. This evolutionary step in design is not only creating exciting opportunities for the designers themselves but also for every stakeholder in a given project. From designers, engineers and contractors through to H&S managers, BIM is accelerating a cultural change. The paper introduces the concept behind a research project that mitigates the H&S risks associated with the construction, operation and maintenance of assets through the adoption of BIM.

Keywords: building information modeling, BIM levels, health, safety, integration

Procedia PDF Downloads 232
24440 The Role of ICT for Income Inequality: The Model and the Simulations

Authors: Shoji Katagiri

Abstract:

This paper aims to clarify the relationship between ICT and income inequality. To do so, we develop a general equilibrium model with ICT investment, obtain the equilibrium solutions, and then simulate the model with these solutions for some OECD countries. As a result, we generally confirm a positive relationship between ICT investment and income inequality over the corresponding periods. In this model, an increase in the ratio of ICT investment to aggregate investment in capital stock enhances capital's share of income and ultimately leads to income inequality, such as an increase in the share of the top decile of income. Although we confirm the positive relationship between ICT investment and income inequality, the strength of this upward trend depends on the parameter values used in the simulations, and these parameters do not uniquely determine the magnitudes of the simulated results.

Keywords: ICT, inequality, capital accumulation, technology

Procedia PDF Downloads 209
24439 Integrating Virtual Reality and Building Information Model-Based Quantity Takeoffs for Supporting Construction Management

Authors: Chin-Yu Lin, Kun-Chi Wang, Shih-Hsu Wang, Wei-Chih Wang

Abstract:

A construction superintendent needs to know not only the quantities of cost items or materials completed each day, in order to develop a daily report or calculate the daily progress (earned value), but also the quantities of materials (e.g., reinforcing steel and concrete) to be ordered (or moved onto the jobsite) for performing the in-progress or ready-to-start construction activities (e.g., erection of reinforcing steel and concrete pouring). These daily construction management tasks require great effort to extract accurate quantities in a short time (usually right before getting off work each day). As a result, most superintendents can only provide these quantity data based either on what they see on the site (low accuracy) or on the extraction of quantities from two-dimensional (2D) construction drawings (high time consumption). Hence, the current practice of reporting the quantities completed each day needs improvement in terms of both accuracy and efficiency. Recently, three-dimensional (3D) building information model (BIM) techniques have been widely applied to support the construction quantity takeoff (QTO) process, and virtual reality (VR) allows a building to be viewed from a first-person viewpoint. This study therefore proposes an innovative system that integrates VR (using 'Unity') and BIM (using 'Revit') to extract quantities in support of the above daily construction management tasks. The use of VR allows a system user to be present in a virtual building and thus to assess the construction progress more objectively from the office. This VR- and BIM-based system is also supported by an integrated database (consisting of the information and data associated with the BIM model, QTO, and costs). Each day, a superintendent can walk through the BIM-based virtual building to quickly identify (via a developed VR shooting function) the building components (or objects) that are in progress or finished on the jobsite, and then specify a percentage (e.g., 20%, 50% or 100%) of completion for each identified building object based on observation of the jobsite. Next, the system generates the quantities completed that day by multiplying the specified percentage by the full quantities of the cost items (or materials) associated with the identified object. A building construction project located in northern Taiwan is used as a case study to test the benefits (i.e., accuracy and efficiency) of the proposed system in quantity extraction for supporting the development of daily reports and the ordering of construction materials.
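
As a rough illustration of the computation just described — the completed quantity of each cost item equals the specified completion percentage multiplied by the full BIM takeoff quantity — the following minimal sketch uses placeholder object IDs and quantities, not the authors' Unity/Revit database schema.

```python
# Minimal sketch of the daily-quantity computation described above.
# The BIM/QTO records and object IDs below are illustrative placeholders.

# Full takeoff quantities per building object, keyed by cost item/material.
qto_database = {
    "B2-column-C12": {"rebar_kg": 420.0, "concrete_m3": 1.8},
    "B2-slab-S03":   {"rebar_kg": 960.0, "concrete_m3": 12.5},
}

def daily_completed_quantities(observations):
    """observations: {object_id: completion fraction in [0, 1]} marked in VR."""
    totals = {}
    for object_id, fraction in observations.items():
        for item, full_qty in qto_database[object_id].items():
            totals[item] = totals.get(item, 0.0) + fraction * full_qty
    return totals

# Example: the superintendent marks the column 100% done and the slab 20% done.
print(daily_completed_quantities({"B2-column-C12": 1.0, "B2-slab-S03": 0.2}))
# -> {'rebar_kg': 612.0, 'concrete_m3': 4.3}
```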

Keywords: building information model, construction management, quantity takeoffs, virtual reality

Procedia PDF Downloads 120
24438 Optical and Double Folding Analysis for 6Li+16O Elastic Scattering

Authors: Abd Elrahman Elgamala, N. Darwish, I. Bondouk, Sh. Hamada

Abstract:

Available experimental angular distributions for 6Li elastically scattered from the 16O nucleus in the energy range 13.0–50.0 MeV are investigated and reanalyzed using the optical model with a conventional phenomenological potential and also using the double-folding optical model with different interaction models: DDM3Y1, CDM3Y1, CDM3Y2, and CDM3Y3. All the interaction models involved are based on the M3Y Paris interaction except DDM3Y1, which is based on M3Y Reid; the main difference between them lies in the values of the parameters of the incorporated density distribution function F(ρ). We have extracted the renormalization factor NR for the 6Li+16O nuclear system in the energy range 13.0–50.0 MeV using the aforementioned interaction models.
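
For reference, the double-folding potential underlying these interaction models is conventionally written as follows (a standard form; the notation is assumed here rather than taken from the paper):

```latex
% Double-folding real potential: rho_P, rho_T are projectile/target densities,
% v_NN the effective nucleon-nucleon interaction, N_R the renormalization factor.
\[
V_{DF}(\mathbf{R}) = N_R \int\!\!\int \rho_P(\mathbf{r}_1)\,\rho_T(\mathbf{r}_2)\,
  v_{NN}(E,\rho,\mathbf{s})\, d^3 r_1\, d^3 r_2 ,
\qquad \mathbf{s} = \mathbf{R} + \mathbf{r}_2 - \mathbf{r}_1 ,
\]
% Density-dependent M3Y-type interactions factorize as
\[
v_{NN}(E,\rho,s) = g(E)\, F(\rho)\, v(s),
\]
% where F(\rho) carries the density dependence whose parameters distinguish
% DDM3Y1, CDM3Y1, CDM3Y2 and CDM3Y3.
```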

Keywords: elastic scattering, optical model, folding potential, density distribution

Procedia PDF Downloads 134
24437 Test of Capital Account Monetary Model of Floating Exchange Rate Determination: Further Evidence from Selected African Countries

Authors: Oloyede John Adebayo

Abstract:

This paper tests a variant of the monetary model of exchange rate determination, Frankel's capital account monetary model (CAAM) based on the real interest rate differential, on the floating exchange rate experiences of three developing African countries: Ghana, Nigeria and the Gambia. The study adopts the autoregressive instrumental variable (AIV) approach and the Almon polynomial lag procedure of regression analysis, based on the assumption that the coefficients follow a third-order polynomial with zero-end constraint. The results provide some support for the CAAM hypothesis that the exchange rate responds proportionately to changes in the money supply, inversely to income, and positively to interest rate and expected inflation differentials. On this basis, the study points monetary authorities and researchers to the relevance and usefulness of the CAAM as an appropriate tool and a useful benchmark for analyzing the exchange rate behaviour of most developing countries.
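
A generic reduced form consistent with the signs reported above (illustrative notation, not the paper's exact estimating equation) is:

```latex
% s: log exchange rate; m, y, i, \pi^e: log money, log income, interest rate,
% expected inflation; starred variables are foreign; u is the error term.
\[
s_t = \beta_0 + \beta_1 (m_t - m_t^{*}) - \beta_2 (y_t - y_t^{*})
      + \beta_3 (i_t - i_t^{*}) + \beta_4 (\pi^{e}_t - \pi^{e*}_t) + u_t ,
\qquad \beta_1 \approx 1,\;\; \beta_2,\beta_3,\beta_4 > 0 .
\]
```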

Keywords: exchange rate, monetary model, interest differentials, capital account

Procedia PDF Downloads 393
24436 Prediction of Phonon Thermal Conductivity of F.C.C. Al by Molecular Dynamics Simulation

Authors: Leila Momenzadeh, Alexander V. Evteev, Elena V. Levchenko, Tanvir Ahmed, Irina Belova, Graeme Murch

Abstract:

In this work, the phonon thermal conductivity of f.c.c. Al is investigated in detail in the temperature range 100–900 K within the framework of equilibrium molecular dynamics simulations, making use of the Green-Kubo formalism and one of the most reliable embedded-atom method potentials. It is found that the heat current auto-correlation function of the f.c.c. Al model demonstrates a two-stage temporal decay similar to that previously observed for the f.c.c. Cu model. After the first stage of decay, the heat current auto-correlation function of the f.c.c. Al model demonstrates a peak in the temperature range 100–800 K. The intensity of the peak decreases as the temperature increases; at 900 K, it transforms into a shoulder. To describe the observed two-stage decay of the heat current auto-correlation function of the f.c.c. Al model, we employ a decomposition model recently developed for phonon-mediated thermal transport in a monatomic lattice. We find that the electronic contribution to the total thermal conductivity of f.c.c. Al dominates over the whole studied temperature range. However, the phonon contribution to the total thermal conductivity of f.c.c. Al increases as the temperature decreases: it is about 1.05% at 900 K and about 12.5% at 100 K.
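
The Green-Kubo relation used in such equilibrium MD calculations expresses the phonon (lattice) thermal conductivity in terms of the heat current auto-correlation function (standard isotropic form; notation assumed):

```latex
% V: simulation cell volume, k_B: Boltzmann constant, T: temperature,
% J(t): total heat current vector of the cell.
\[
\kappa_{ph} = \frac{1}{3\, V\, k_B\, T^{2}} \int_{0}^{\infty}
              \langle \mathbf{J}(t) \cdot \mathbf{J}(0) \rangle \, dt .
\]
```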

Keywords: aluminum, Green-Kubo formalism, molecular dynamics, phonon thermal conductivity

Procedia PDF Downloads 403
24435 A Comparison of Neural Network and DOE-Regression Analysis for Predicting Resource Consumption of Manufacturing Processes

Authors: Frank Kuebler, Rolf Steinhilper

Abstract:

Artificial neural networks (ANN) as well as Design of Experiments (DOE) based regression analysis (RA) are mainly used for the modeling of complex systems. Both methodologies are commonly applied in process and quality control of manufacturing processes. Because resource efficiency has become a critical concern for manufacturing companies, these models need to be extended to predict the resource consumption of manufacturing processes. This paper describes an approach that uses neural networks as well as DOE-based regression analysis for predicting the resource consumption of manufacturing processes and compares the achievable results based on an industrial case study of a turning process.
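
A minimal, self-contained sketch of such a comparison is shown below, using scikit-learn with a synthetic stand-in for the turning-process data (the features, response and model settings are assumptions, not the industrial case study):

```python
# Illustrative comparison of an ANN and a DOE-style polynomial regression for
# predicting resource consumption on synthetic "turning" data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_percentage_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

rng = np.random.default_rng(0)
# Features: cutting speed, feed, depth of cut (placeholder ranges).
X = rng.uniform([50, 0.05, 0.5], [300, 0.4, 3.0], size=(200, 3))
energy = 0.02 * X[:, 0] * X[:, 2] + 5.0 / X[:, 1] + rng.normal(0, 1.0, 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, energy, random_state=0)

doe_ra = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                                 random_state=0))

for name, model in [("DOE regression", doe_ra), ("ANN", ann)]:
    model.fit(X_tr, y_tr)
    err = mean_absolute_percentage_error(y_te, model.predict(X_te))
    print(f"{name}: MAPE = {err:.2%}")
```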

Keywords: artificial neural network, design of experiments, regression analysis, resource efficiency, manufacturing process

Procedia PDF Downloads 508
24434 Productivity-Emotiveness Model of School Students’ Capacity Levels

Authors: Ivan Samokhin

Abstract:

A new two-factor model of school students' capacity levels is proposed. It considers the academic productivity and emotional condition of children taking part in the study process, and each basic level reflects the correlation of these two factors. The teacher decides whether the required result is achieved or not and writes down the grade (from 'A' to 'F') in the register. During the term, the teacher can assess the students' progress at any interval, but it is not desirable to exceed a two-week period (with primary school being an exception). Each boy or girl should have a special notebook to record the emotions they feel while studying a subject. The children can make their notes in the way they like, for example, using a ten-point scale or a short verbal description. It is recommended to record the emotions twice a day: after the lesson and after doing the homework. Before the students start doing this, they should be instructed by a school psychologist, who has to emphasize that what matters is their attitude to the subject, not to the person teaching it. At the end of the term, the notebooks are given to the teacher, who is then able to draw preliminary conclusions about the academic results and psychological comfort of each student. If necessary, some pedagogical measures can be taken. The data about a student's supposed capacity level are available to the teacher and the school administration. In certain cases, this information can also be revealed to the student's parents, while the student learns it only after receiving a school-leaving certificate (until this moment, the results are not considered final). A person may then take these data into consideration when choosing his or her future area of higher education. We single out four main capacity levels: 'nominally low', 'inclination', 'ability' and 'gift'.

Keywords: academic productivity, capacity level, emotional condition, school students

Procedia PDF Downloads 212
24433 Port Logistics Integration: Challenges and Approaches: Case ‎Study; Iranian Seaports

Authors: Ali Alavi, Hong-Oanh Nguyen, ‎Jiangang Fei, Jafar Sayareh

Abstract:

The recent competitive market in the port sector depends heavily on logistics practices, functions and activities, and seaports play a key role in port logistics chains. Despite the well-articulated importance of ports and terminals in integrated logistics, the role of success factors in port logistics integration has rarely been examined. The objective of this paper is to fill this gap in the literature and to provide insight into how seaports and terminals may improve their logistics integration. First, a literature review of studies on logistics integration in seaports and terminals is conducted. Second, a new conceptual framework for port logistics integration is proposed to incorporate the role of new variables emerging from recent developments in the global business environment. Third, the model is tested in the Iranian port and maritime sector using a self-administered and online survey among logistics chain actors in Iranian seaports, such as shipping line operators, logistics service providers, port authorities, logistics companies and other related actors. The results show that logistics processes and operations, information integration, value-added services, and logistics practices influence logistics integration. A conceptual framework is proposed that extends the existing framework and incorporates the variables of organizational activities, resource sharing, and institutional support. Further examination of the proposed model across multiple contexts is necessary to validate the findings. The framework could also be elaborated in more detail for each factor and consider the actors' perspectives.

Keywords: maritime logistics‎, port integration‎, logistics integration‎, supply chain integration

Procedia PDF Downloads 220
24432 A Study for Area-level Mosquito Abundance Prediction by Using Supervised Machine Learning Point-level Predictor

Authors: Theoktisti Makridou, Konstantinos Tsaprailis, George Arvanitakis, Charalampos Kontoes

Abstract:

In the literature, data-driven approaches to mosquito abundance prediction rely on supervised machine learning models trained with historical in-situ measurements. The drawback of this approach is that, once the model is trained on point-level (specific x, y coordinates) measurements, its predictions also refer to the point level. These point-level predictions reduce the applicability of such solutions, since many early-warning and mitigation applications need predictions at an area level, such as a municipality or village. In this study, we apply a data-driven predictive model that relies on public, open satellite Earth Observation and geospatial data and is trained with historical point-level in-situ measurements of mosquito abundance. We then propose a methodology to extend a point-level predictive model to a broader area-level prediction. Our methodology relies on random spatial sampling of the area of interest (similar to a Poisson hard-core process), obtaining the EO and geomorphological information for each sample, making a point-wise prediction for each sample, and aggregating the predictions to represent the average mosquito abundance of the area. We quantify the performance of the transformation from point-level to area-level predictions and analyze it in order to understand which parameters have a positive or negative impact on it. The goal of this study is to propose a methodology that predicts the mosquito abundance of a given area by relying on point-level predictions and to provide qualitative insights regarding the expected performance of the area-level prediction. We applied our methodology to historical data (of Culex pipiens) for two areas of interest (the Veneto region of Italy and Central Macedonia in Greece). In both cases, the results were consistent: the mean mosquito abundance of a given area can be estimated with accuracy similar to that of the point-level predictor, sometimes even better. The density of the samples used to represent an area has a positive effect on the performance, in contrast to the raw number of sampling points, which is not informative about performance without the size of the area. Additionally, we observed that the distance between the sampling points and the real in-situ measurements used for training did not strongly affect the performance.
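
The aggregation step described above can be sketched in a few lines; in the sketch below the point-level predictor, the rectangular area and the hard-core minimum distance are placeholders, not the models or regions used in the study:

```python
# Minimal sketch of the point-to-area aggregation: sample random locations
# inside the area of interest, run the point-level predictor at each location,
# and average the predictions.
import numpy as np

def point_level_predictor(lon, lat):
    # Placeholder for a trained model fed with EO/geomorphological features
    # extracted at (lon, lat); here just a smooth synthetic surface.
    return 50 + 30 * np.sin(lon) * np.cos(lat)

def area_level_abundance(bounds, n_samples=200, min_dist=0.0, rng=None):
    """Estimate mean abundance over a rectangular area
    (lon_min, lat_min, lon_max, lat_max) by spatial sampling; a hard-core
    minimum distance between samples can be enforced."""
    rng = rng or np.random.default_rng(0)
    lon_min, lat_min, lon_max, lat_max = bounds
    points = []
    while len(points) < n_samples:
        p = rng.uniform([lon_min, lat_min], [lon_max, lat_max])
        if all(np.hypot(*(p - q)) >= min_dist for q in points):
            points.append(p)
    preds = [point_level_predictor(lon, lat) for lon, lat in points]
    return float(np.mean(preds))

print(area_level_abundance((22.0, 40.0, 23.0, 41.0), min_dist=0.01))
```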

Keywords: mosquito abundance, supervised machine learning, culex pipiens, spatial sampling, west nile virus, earth observation data

Procedia PDF Downloads 127
24431 Structural Damage Detection Using Modal Data Employing Teaching Learning Based Optimization

Authors: Subhajit Das, Nirjhar Dhang

Abstract:

Structural damage detection is a challenging task in the field of structural health monitoring (SHM). Damage detection methods mainly focus on determining the location and severity of the damage. Model updating is a well-known method to locate and quantify damage. In this method, an error function is defined in terms of the difference between the signal measured in the 'experiment' and the signal obtained from the undamaged finite element model. This error function is minimised with a proper algorithm, and the finite element model is updated accordingly to match the measured response; the damage location and severity can then be identified from the updated model. In this paper, the error function is defined in terms of modal data, viz. natural frequencies and the modal assurance criterion (MAC), which is derived from the eigenvectors. This error function is minimized by the teaching-learning-based optimization (TLBO) algorithm, and the finite element model is updated accordingly to locate and quantify the damage. Damage is introduced into the model by reducing the stiffness of a structural member. The 'experimental' data are simulated by finite element modelling, and measurement error is introduced into the synthetic 'experimental' data by adding random noise that follows a Gaussian distribution. The efficiency and robustness of this method are demonstrated through three examples: a truss, a beam and a frame problem. The results show that the TLBO algorithm is efficient in detecting the damage location as well as the severity of damage using modal data.
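
For reference, the modal assurance criterion between an analytical mode shape and its 'experimental' counterpart, together with an illustrative form of an error function built from frequencies and MAC values (the exact weighting used in the paper may differ), can be written as:

```latex
% Modal assurance criterion between analytical and experimental mode shapes:
\[
\mathrm{MAC}(\boldsymbol{\varphi}_a, \boldsymbol{\varphi}_e) =
  \frac{\left| \boldsymbol{\varphi}_a^{T} \boldsymbol{\varphi}_e \right|^{2}}
       {\left( \boldsymbol{\varphi}_a^{T} \boldsymbol{\varphi}_a \right)
        \left( \boldsymbol{\varphi}_e^{T} \boldsymbol{\varphi}_e \right)} .
\]
% Illustrative error function minimized by TLBO, where x collects the
% element-wise stiffness reduction factors being updated:
\[
J(\mathbf{x}) = \sum_{i=1}^{n} w_{f,i}
  \left( \frac{f_{e,i} - f_{a,i}(\mathbf{x})}{f_{e,i}} \right)^{2}
  + \sum_{i=1}^{n} w_{m,i}\,\bigl( 1 - \mathrm{MAC}_i(\mathbf{x}) \bigr) .
\]
```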

Keywords: damage detection, finite element model updating, modal assurance criteria, structural health monitoring, teaching learning based optimization

Procedia PDF Downloads 203
24430 Simulation of Die Casting Process in an Industrial Helical Gearbox Flange Die

Authors: Mehdi Modabberifar, Behrouz Raad, Bahman Mirzakhani

Abstract:

Flanges are widely used for connecting valves, pipes and other industrial devices such as gearboxes. The method used to produce a flange has a considerable impact on how it integrates with industrial engines and gearboxes. By using die casting instead of sand casting and machining to manufacture flanges, production speed and the dimensional accuracy of the parts increase. In addition, in die casting the obtained dimensions are close to the final dimensions, so the need for machining the flanges after die casting decreases, which yields significant savings in raw materials and improves the mechanical properties of the flanges. In this paper, a typical die for an industrial helical gearbox flange (size ISO 50) was designed, and the die casting process for producing this type of flange was simulated using ProCAST software. The results of the simulation were used to optimize the die design. Finally, using the results of the analysis, the optimized die was built.

Keywords: die casting, finite element, flange, helical gearbox

Procedia PDF Downloads 353
24429 Coastalization and Urban Sprawl in the Mediterranean: Using High-Resolution Multi-Temporal Data to Identify Typologies of Spatial Development

Authors: Apostolos Lagarias, Anastasia Stratigea

Abstract:

Coastal urbanization is heavily affecting the Mediterranean, taking the form of linear urban sprawl along the coastal zone. This process is putting extreme pressure on ecosystems, leading to an unsustainable model of growth. The aim of this research is to analyze coastal urbanization patterns in the Mediterranean using high-resolution multi-temporal data provided by the Global Human Settlement Layer (GHSL) database. The methodology involves the estimation of a set of spatial metrics characterizing the density, aggregation/clustering and dispersion of built-up areas. The Spanish coast and the Italian Adriatic coast are examined as case study areas. Coastalization profiles are examined, and selected sub-areas massively affected by tourism development and suburbanization trends (Costa Blanca/Murcia, Costa del Sol, Puglia, the Emilia-Romagna coast) are analyzed and compared. The results show considerable differences between the Spanish and Italian typologies of spatial development, related to the land use structure and planning policies applied in each case. Monitoring and analyzing spatial patterns could inform integrated Mediterranean strategies for coastal areas and redirect spatial/environmental policies towards a more sustainable model of growth.

Keywords: coastalization, Mediterranean, multi-temporal, urban sprawl, spatial metrics

Procedia PDF Downloads 120
24428 Microwave Dielectric Relaxation Study of Diethanolamine with Triethanolamine from 10 MHz-20 GHz

Authors: A. V. Patil

Abstract:

The microwave dielectric relaxation of the diethanolamine-triethanolamine binary mixture has been studied over the frequency range of 10 MHz to 20 GHz, at various temperatures, using the time domain reflectometry (TDR) method for 11 concentrations of the system. The present work examines the molecular interactions between the same multifunctional groups [-OH and -NH2] of the alkanolamines (diethanolamine and triethanolamine) using different models such as the Debye model, excess models, and the Kirkwood model. The dielectric parameters, viz. the static dielectric constant (ε0) and relaxation time (τ), have been obtained by the least squares fit method using the Debye equation, which is characterized by a single relaxation time without a distribution of relaxation times.
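
The single-relaxation-time Debye equation referred to above has the standard form:

```latex
% Debye model with a single relaxation time (no distribution of times):
% \varepsilon_0: static permittivity, \varepsilon_\infty: high-frequency limit,
% \tau: relaxation time, \omega: angular frequency.
\[
\varepsilon^{*}(\omega) = \varepsilon_{\infty}
  + \frac{\varepsilon_{0} - \varepsilon_{\infty}}{1 + j\omega\tau} .
\]
```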

Keywords: diethanolamine, excess properties, kirkwood properties, time domain reflectometry, triethanolamine

Procedia PDF Downloads 285
24427 Business Process Management and Organizational Culture in Big Companies: Cross-Country Analysis

Authors: Dalia Suša Vugec

Abstract:

Business process management (BPM) is a widely used approach focused on designing, mapping, changing, managing and analyzing the business processes of an organization, which eventually leads to better performance and many other benefits. Since every organization strives to improve its performance in order to be sustainable and to remain competitive on the market in the long term, numerous organizations are nowadays adopting and implementing BPM. However, not all organizations are equally successful in doing so. One way of measuring BPM success is to measure its maturity by calculating the Process Performance Index (PPI) from ten BPM success factors. Still, although BPM is a holistic concept, organizational culture is not taken into consideration in calculating the PPI. Hence, the aim of this paper is twofold: first, to explore and analyze the current state of the BPM success factors within big organizations from Slovenia, Croatia, and Austria, and second, to analyze the structure of organizational culture within the observed companies, focusing on its link with the BPM success factors. The presented study is based on the results of a questionnaire conducted as part of the PROSPER project (IP-2014-09-3729), financed by the Croatian Science Foundation. The results of the questionnaire reveal differences between the three observed countries in the achieved levels of the BPM success factors and, therefore, in overall BPM maturity. Moreover, the structure of organizational culture also differs across the three countries. This paper discusses the revealed differences between the countries as well as the link between organizational culture and the BPM success factors.

Keywords: business process management, BPM maturity, BPM success factors, organizational culture, process performance index

Procedia PDF Downloads 105
24426 Performance Evaluation of the Classic seq2seq Model versus a Proposed Semi-supervised Long Short-Term Memory Autoencoder for Time Series Data Forecasting

Authors: Aswathi Thrivikraman, S. Advaith

Abstract:

The study is aimed at designing encoders for deciphering intricacies in time series data by redescribing the dynamics operating on a lower-dimensional manifold. A semi-supervised LSTM autoencoder is devised and investigated to see whether the latent representation of the time series data can improve forecasting. End-to-end training of the LSTM autoencoder, together with another LSTM network connected to the latent space, forces the hidden states of the encoder to represent the latent variables most relevant for forecasting. Furthermore, the study compares the predictions with those of a traditional seq2seq model.
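
A minimal PyTorch sketch of this kind of architecture — an LSTM autoencoder whose latent state also feeds a forecasting head, trained end-to-end on a combined reconstruction and forecast loss — is given below; the layer sizes, loss weighting and names are assumptions, not the authors' configuration.

```python
# Sketch of an LSTM autoencoder with a forecasting head on the latent space.
import torch
import torch.nn as nn

class LSTMAutoencoderForecaster(nn.Module):
    def __init__(self, n_features=1, latent_dim=32, horizon=10):
        super().__init__()
        self.horizon = horizon
        self.encoder = nn.LSTM(n_features, latent_dim, batch_first=True)
        self.decoder = nn.LSTM(latent_dim, latent_dim, batch_first=True)
        self.recon_head = nn.Linear(latent_dim, n_features)
        self.forecaster = nn.LSTM(latent_dim, latent_dim, batch_first=True)
        self.forecast_head = nn.Linear(latent_dim, n_features)

    def forward(self, x):                       # x: (batch, seq_len, n_features)
        _, (h, _) = self.encoder(x)             # h: (1, batch, latent_dim)
        z = h[-1]                               # latent representation
        # Reconstruction: repeat the latent vector along the input length.
        dec_out, _ = self.decoder(z.unsqueeze(1).repeat(1, x.size(1), 1))
        recon = self.recon_head(dec_out)
        # Forecast: repeat the latent vector along the forecast horizon.
        fc_out, _ = self.forecaster(z.unsqueeze(1).repeat(1, self.horizon, 1))
        forecast = self.forecast_head(fc_out)
        return recon, forecast

# One training step on synthetic data (batch of 8 series, 50 past steps).
model = LSTMAutoencoderForecaster()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
past, future = torch.randn(8, 50, 1), torch.randn(8, 10, 1)
recon, forecast = model(past)
loss = nn.functional.mse_loss(recon, past) + nn.functional.mse_loss(forecast, future)
optim.zero_grad(); loss.backward(); optim.step()
```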

Keywords: LSTM, autoencoder, forecasting, seq2seq model

Procedia PDF Downloads 137
24425 Hierarchical Tree Long Short-Term Memory for Sentence Representations

Authors: Xiuying Wang, Changliang Li, Bo Xu

Abstract:

A fixed-length feature vector is required by many machine learning algorithms in the NLP field. Word embeddings have been very successful at learning lexical information. However, they cannot capture the compositional meaning of sentences, which prevents a deeper understanding of language. In this paper, we introduce a novel hierarchical tree long short-term memory (HTLSTM) model that learns vector representations for sentences of arbitrary syntactic type and length. We propose to split each sentence into three hierarchies: the short phrase, long phrase and full sentence levels. The HTLSTM model gives our algorithm the potential to fully exploit the hierarchical information and long-term dependencies of language. We design experiments on both English and Chinese corpora to evaluate our model on the sentiment analysis task, and the results show that our model significantly outperforms several existing state-of-the-art approaches.

Keywords: deep learning, hierarchical tree long short-term memory, sentence representation, sentiment analysis

Procedia PDF Downloads 339
24424 'Gender' and 'Gender Equalities': Conceptual Issues

Authors: Moustafa Ali

Abstract:

The aim of this paper is to discuss and question some of the widely accepted concepts within the conceptual framework of gender from terminological, scientific, and Muslim cultural perspectives, and to introduce a new definition and model of gender in Arab and Muslim societies. The paper therefore uses a generic methodology and document analysis and comes in three sections and a conclusion. The first section discusses some of the terminological issues in the conceptual framework of gender. The second section highlights scientific issues and introduces a definition and a model of gender, whereas the third section offers Muslim cultural perspectives on some issues related to gender in the Muslim world. The paper then concludes with the findings and recommendations reached so far.

Keywords: gender definition, gender equalities, sex-gender separability, fairness-based model of gender

Procedia PDF Downloads 120
24423 A Mathematical Programming Model for Lot Sizing and Production Planning in Multi-Product Companies: A Case Study of Azar Battery Company

Authors: Farzad Jafarpour Taher, Maghsud Solimanpur

Abstract:

Production planning is one of the complex tasks in multi-product firms that produce a wide range of products. Since resources in mass production companies are limited and different products use common resources, careful planning is needed so that firms can respond to customer needs efficiently. Azar Battery Company is a firm that provides twenty types of products to its customers, so careful planning must be performed in this company. In this research, the current conditions of Azar Battery Company were investigated in order to provide a mathematical programming model that determines the optimum production rates of the products. The production system of this company is multi-stage, multi-product and multi-period, and it is studied over a one-year planning horizon with respect to machine capacity and warehouse space limitations. The problem has been modeled as a linear programming model with deterministic demand in which shortage is not allowed. The objective function of this model is to minimize costs (including raw materials, the assembly stage, energy, packaging, and holding). Finally, this model has been solved with Lingo software using the branch and bound approach. Since the computation time was very long, the solver was interrupted, and the obtained feasible solution was used for comparison. The solution costs of the proposed model have been compared to the company's real data; this non-optimal solution reduces the total production costs of the company by about 35%.
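
An illustrative compact form of such a multi-product, multi-period lot-sizing LP (generic index sets and symbols, not the company's exact formulation) is:

```latex
% i: product, t: period, r: shared resource; x_{it}: production quantity,
% I_{it}: end-of-period inventory, d_{it}: demand, c_{it}, h_{it}: unit
% production and holding costs, a_{ir}: resource usage, K_{rt}: capacity,
% v_i: unit storage volume, W: warehouse space.
\[
\min \; \sum_{i}\sum_{t} \bigl( c_{it}\, x_{it} + h_{it}\, I_{it} \bigr)
\]
\[
\text{s.t.}\quad
I_{i,t-1} + x_{it} - I_{it} = d_{it} \;\;\forall i,t \;\;(\text{no shortage}),
\qquad
\sum_{i} a_{ir}\, x_{it} \le K_{rt} \;\;\forall r,t \;\;(\text{machine capacity}),
\]
\[
\sum_{i} v_{i}\, I_{it} \le W \;\;\forall t \;\;(\text{warehouse space}),
\qquad
x_{it},\, I_{it} \ge 0 .
\]
```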

Keywords: multi-period, multi-product production, multi-stage, production planning

Procedia PDF Downloads 75
24422 Microwave Sanitization of Polyester Fabrics

Authors: K. Haggag, M. Salama, H. El-Sayed

Abstract:

Polyester fabrics were sanitized by exposing them to vaporized water under conventional heating or microwave irradiation. Hydrogen peroxide was added to the humid sanitizing environment as a disinfectant. The sanitization process was found to be effective against two types of bacteria, namely Escherichia coli ATCC 2666 (Gram-negative) and Staphylococcus aureus ATCC 6538 (Gram-positive). The effect of the sanitization process on some of the inherent properties of the polyester fabrics was monitored.

Keywords: polyester, fabric, sanitization, microwave, bacteria

Procedia PDF Downloads 356
24421 Estimation of World Steel Production by Process

Authors: Reina Kawase

Abstract:

World GHG emissions should be reduced by 50% by 2050 compared with the 1990 level, and CO2 emission reduction in the steel sector, an energy-intensive sector, is essential. To estimate CO2 emissions from the steel sector worldwide, an estimate of steel production is required. World steel production by process is estimated for the period 2005-2050, with the world divided into 35 aggregated regions. Two kinds of steelmaking processes are considered: the basic oxygen furnace (BOF) and the electric arc furnace (EAF). Steel production by process in each region is decided based on current production capacity, the supply-demand balance of steel and scrap, technological innovation in steelmaking, steel consumption projections, and goods trade. World steel production under the moderate countermeasure scenario in 2050 increases by a factor of 1.3 compared with that in 2012. When domestic scrap recycling is promoted, steel production in developed regions increases by about 1.5 times, and their share changes from 34% (2012) to about 40% (2050); this is because developed regions are the main suppliers of scrap. Between 48% and 57% of world steel production is produced by EAF. Under the scenario that emphasizes the supply-demand balance of steel, steel production in developing regions increases by a factor of 1.4 and is larger than that in developed regions; the share of developing regions, however, is not very different from the current level. The increase in steel production by EAF is largest under the scenario in which the supply-demand balance of steel is an important factor, where the EAF share reaches 65%.

Keywords: global steel production, production distribution scenario, steel making process, supply-demand balance

Procedia PDF Downloads 432
24420 Simple Multiple-Attribute Rating Technique for Optimal Decision-Making Model on Selecting Best Spiker of World Grand Prix

Authors: Chen Chih-Cheng, Chen I-Cheng, Lee Yung-Tan, Kuo Yen-Whea, Yu Chin-Hung

Abstract:

The purpose of this study is to construct a model for selecting the best spike player in a top world volleyball tournament. The data consisted of the records of the 2013 World Grand Prix published by the International Volleyball Federation (FIVB). The Simple Multiple-Attribute Rating Technique (SMART) was used to build an optimal decision-making model for best spike player selection. The results show that the best spike player ranking obtained by SMART differs from the ranking published by the FIVB, demonstrating the effectiveness and feasibility of the proposed model.
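
A minimal sketch of the SMART scoring step — normalize each attribute to a common scale, take a weighted sum, and rank — is shown below; the attributes, weights and player records are invented for illustration and are not the 2013 World Grand Prix data.

```python
# SMART scoring sketch: normalize each attribute to [0, 1], weight, sum, rank.
attributes = ["spikes", "faults", "shots"]          # per-tournament attack record
weights = {"spikes": 0.6, "faults": 0.2, "shots": 0.2}
higher_is_better = {"spikes": True, "faults": False, "shots": True}

players = {
    "Player A": {"spikes": 150, "faults": 40, "shots": 90},
    "Player B": {"spikes": 135, "faults": 25, "shots": 110},
    "Player C": {"spikes": 160, "faults": 55, "shots": 70},
}

def smart_ranking(players):
    scores = {name: 0.0 for name in players}
    for attr in attributes:
        values = [p[attr] for p in players.values()]
        lo, hi = min(values), max(values)
        for name, p in players.items():
            u = (p[attr] - lo) / (hi - lo) if hi > lo else 1.0
            if not higher_is_better[attr]:
                u = 1.0 - u                      # invert cost-type attributes
            scores[name] += weights[attr] * u
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

for rank, (name, score) in enumerate(smart_ranking(players), start=1):
    print(rank, name, round(score, 3))
```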

Keywords: simple multiple-attribute rating technique, World Grand Prix, best spike player, International Volleyball Federation

Procedia PDF Downloads 458
24419 A Study on the Effect of the Work-Family Conflict on Work Engagement: A Mediated Moderation Model of Emotional Exhaustion and Positive Psychology Capital

Authors: Sungeun Hyun, Sooin Lee, Gyewan Moon

Abstract:

Work-family conflict (WFC) has been an active research area for the past decades. WFC harms individuals and organizations and is ultimately expected to impose losses on the company in the long run. Research on WFC has mainly focused on its effects on organizational effectiveness and job attitudes, using variables such as job satisfaction, organizational commitment, and turnover intention. This study differs from previous research in its choice of consequence variable: we select the positive job attitude of work engagement as a consequence of WFC. The primary purpose of this research is to identify the negative effects of WFC, and it starts from the recognition that research on the direct influence of WFC on work engagement is lacking. Based on conservation of resources (COR) theory and the job demands-resources (JD-R) model, an empirical model examining the negative effects of WFC, with emotional exhaustion as the link between WFC and work engagement, was proposed and validated. It was also analyzed how much positive psychological capital buffers the negative effects arising from WFC within this relationship, and a mediated moderation model, in which positive psychological capital conditions the indirect effect of WFC on work engagement through emotional exhaustion, was verified. Data were collected using questionnaires distributed to 500 employees engaged in manufacturing, services, finance, IT, education services, and other sectors, of which 389 were used in the statistical analysis. The data were analyzed with SPSS 21.0, the SPSS PROCESS macro, and AMOS 21.0; hierarchical regression analysis and bootstrapping were used for hypothesis testing. The results showed that all hypotheses were supported. First, WFC showed a negative effect on work engagement; specifically, work interference with family (WIF) showed stronger negative effects than family interference with work (FIW). Second, emotional exhaustion was found to mediate the relationship between WFC and work engagement. Third, positive psychological capital was shown to moderate the relationship between WFC and emotional exhaustion. Fourth, in the integrated test of mediated moderation, positive psychological capital was shown to buffer the relationships among WFC, emotional exhaustion, and work engagement. Across the hypotheses, WIF consistently showed stronger negative effects than FIW. Finally, the theoretical and practical implications for research on and management of WFC are discussed, and the limitations and future research directions are proposed.
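
The first-stage moderated mediation (mediated moderation) specification typically tested with the PROCESS macro takes a form such as the following (notation assumed here: X = WFC, M = emotional exhaustion, W = positive psychological capital, Y = work engagement):

```latex
% Mediator and outcome equations (first-stage moderated mediation):
\[
M = a_0 + a_1 X + a_2 W + a_3 (X \times W) + e_M ,
\qquad
Y = b_0 + b_1 M + c' X + e_Y ,
\]
% Conditional indirect effect of X on Y through M at a given level of W,
% estimated with bias-corrected bootstrap confidence intervals:
\[
\omega(W) = (a_1 + a_3 W)\, b_1 .
\]
```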

Keywords: emotional exhaustion, positive psychological capital, work engagement, work-family conflict

Procedia PDF Downloads 204
24418 An Experimental Study on Some Conventional and Hybrid Models of Fuzzy Clustering

Authors: Jeugert Kujtila, Kristi Hoxhalli, Ramazan Dalipi, Erjon Cota, Ardit Murati, Erind Bedalli

Abstract:

Clustering is a versatile instrument in the analysis of data collections, providing insights into the underlying structures of the dataset and enhancing modeling capabilities. The fuzzy approach to the clustering problem increases flexibility by involving the concept of partial memberships (some value in the continuous interval [0, 1]) of the instances in the clusters. Several fuzzy clustering algorithms have been devised, such as FCM, Gustafson-Kessel, Gath-Geva, kernel-based FCM, PCM, etc. Each of these algorithms has its own advantages and drawbacks, so none of them performs best on all datasets. In this paper we experimentally compare the FCM, GK and GG algorithms and a hybrid two-stage fuzzy clustering model combining the FCM and Gath-Geva algorithms. First, we theoretically discuss the advantages and drawbacks of each of these algorithms and describe the hybrid clustering model, which exploits the advantages and diminishes the drawbacks of each algorithm. Second, we experimentally compare the accuracy of the hybrid model by applying it to several benchmark and synthetic datasets.
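
For reference, the alternating update rules of the standard FCM algorithm, which the GK and GG variants modify through their distance and covariance terms, are (with fuzzifier m > 1):

```latex
% u_{ij}: membership of point x_j in cluster i, v_i: cluster centre,
% c: number of clusters, N: number of points.
\[
u_{ij} = \left[ \sum_{k=1}^{c}
  \left( \frac{\lVert x_j - v_i \rVert}{\lVert x_j - v_k \rVert} \right)^{\frac{2}{m-1}}
  \right]^{-1},
\qquad
v_i = \frac{\sum_{j=1}^{N} u_{ij}^{\,m}\, x_j}{\sum_{j=1}^{N} u_{ij}^{\,m}} .
\]
```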

Keywords: fuzzy clustering, fuzzy c-means algorithm (FCM), Gustafson-Kessel algorithm, hybrid clustering model

Procedia PDF Downloads 497
24417 Electroforming of 3D Digital Light Processing Printed Sculptures Used as a Low Cost Option for Microcasting

Authors: Cecile Meier, Drago Diaz Aleman, Itahisa Perez Conesa, Jose Luis Saorin Perez, Jorge De La Torre Cantero

Abstract:

In this work, two ways of creating small-sized metal sculptures are proposed: the first by means of microcasting and the second by electroforming, both from models printed in 3D using an FDM (Fused Deposition Modeling) printer or a DLP (Digital Light Processing) printer. It is viable to replace the wax in artistic foundry processes with 3D printed objects. In this technique, the digital models are manufactured with a low-cost 3D FDM printer in polylactic acid (PLA). This material is used because its properties make it a viable substitute for wax within the processes of artistic casting with the lost-wax ceramic shell technique. This technique consists of covering a sculpture of wax, or in this case PLA, with several layers of thermoresistant material; this material is heated to melt out the PLA, producing an empty mold that is later filled with the molten metal. It is verified that the PLA models reduce cost and time compared with hand modeling in wax. In addition, parts can be manufactured with 3D printing that are not possible to create with manual techniques. However, the sculptures created with this technique have a size limit: when pieces printed in PLA are very small, they lose detail, and the laminar texture hides the shape of the piece. A DLP printer makes it possible to obtain smaller and more detailed pieces than an FDM printer, but such small models are quite difficult and complex to melt out using the lost-wax ceramic shell technique. As alternatives, there are microcasting and electroforming, which specialize in creating small metal pieces such as jewelry. Microcasting is a variant of lost-wax casting that consists of placing the model in a cylinder into which the refractory material is also poured. The molds are heated in an oven to melt out the model and bake them; finally, the metal is poured into the still-hot cylinders, which rotate in a machine at high speed to distribute the metal properly. Because microcasting requires expensive materials and machinery to melt a piece of metal, electroforming is an alternative to this process. Electroforming uses models made of different materials; in this study, 3D-printed micro-sculptures are used. These are subjected to an electroforming bath that covers the pieces with a very thin layer of metal. This work investigates the recommended sizes for using 3D printers, both with PLA and with resin, and first tests are being carried out to validate the electroforming of micro-sculptures printed in resin using a DLP printer.

Keywords: sculptures, DLP 3D printer, microcasting, electroforming, fused deposition modeling

Procedia PDF Downloads 120
24416 Special Plea That The Prosecutor Does Not Have Title To Prosecute

Authors: Wium de Villiers

Abstract:

Section 106(1)(h) of the South African Criminal Procedure Act 51 of 1977 provides that an accused may enter a special plea that the prosecutor does not have title to prosecute. In a seminal matter regarding section 106(1)(h) (S v Mousa 2021 2 SACR 378 (GJ)), certain interesting legal aspects emerged. The first concerned the meaning of the term “prosecutor”; more specifically, the question arose whether the term included a prosecutor who was previously involved with the matter, as well as the relevant Deputy Director of Public Prosecutions (DDPP) who instituted the prosecution and oversaw it on behalf of the state. The meaning of the term “title” and, with regard to the conduct of the prosecutor, the term “abuse of process” were also raised and decided. In the paper, the facts of the case, the arguments advanced, and the decisions of the court are discussed critically. The author argues that the objection contemplated in section 106(1)(h) is not intended to cure abuse inflicted by a previous prosecutor or by the DDPP. The author points out that the term “title” includes a lack of authority, non-compliance with jurisdictional requirements or an absence of locus standi, and that an abuse of process takes place if the process is used for an improper, ulterior or collateral purpose. The author also argues that the accused should, instead of relying on section 106(1)(h), have relied on the prior agreement and applied for a permanent stay of prosecution.

Keywords: special plea, prosecutor, title, abuse of process

Procedia PDF Downloads 39
24415 Pharmaceutical Scale up for Solid Dosage Forms

Authors: A. Shashank Tiwari, S. P. Mahapatra

Abstract:

Scale-up is defined as the process of increasing batch size. The scale-up of a process can also be viewed as a procedure for applying the same process to different output volumes. There is a subtle difference between these two definitions: batch size enlargement does not always translate into a size increase of the processing volume. In mixing applications, scale-up is indeed concerned with increasing the linear dimensions from the laboratory to the plant size. On the other hand, processes exist (e.g., tableting) where the term 'scale-up' simply means enlarging the output by increasing the speed. To complete the picture, one should point out special procedures where an increase of scale is counterproductive and 'scale-down' is required to improve the quality of the product. In moving from research and development (R&D) to production scale, it is sometimes essential to have an intermediate batch scale. This is achieved at the so-called pilot scale, which is defined as the manufacture of a drug product by a procedure fully representative of, and simulating, that used at full manufacturing scale. This scale also makes it possible to produce enough product for clinical testing and to manufacture samples for marketing. However, inserting an intermediate step between the R&D and production scales does not, in itself, guarantee a smooth transition: a well-defined process may generate a perfect product both in the laboratory and in the pilot plant and then fail quality assurance tests in production.

Keywords: scale up, research, size, batch

Procedia PDF Downloads 395
24414 Systematic Examination of Methods Supporting the Social Innovation Process

Authors: Mariann Veresne Somosi, Zoltan Nagy, Krisztina Varga

Abstract:

Innovation is a key element of economic development and a key factor in social processes. Technical innovations can be identified as prerequisites and causes of social change, and they cannot be created without the renewal of society. The study of social innovation can be characterised as one of the significant research areas of our day. The study's aim is to identify the process of social innovation, which can be defined by input, transformation, and output factors. This approach divides the social innovation process into three parts: situation analysis, implementation, and follow-up. The methods associated with each stage of the process are illustrated along the chronological line of social innovation. In this study, we have sought to present methodologies that support long- and short-term decision-making, are easy to apply, have different complementary content, and are well visualised for different user groups. When the methods are applied, the reference objects differ: a county, a district, a settlement, or a specific organisation. The solution proposed by the study supports the development of a methodological combination adapted to different situations. Having reviewed metric and conceptualisation issues, in the second part of the study we develop a methodological combination, together with a change management logic, suitable for providing structured support to the generation of social innovation in the case of a locality or a specific organisation. In addition to a theoretical summary, this second part gives a non-exhaustive picture of two counties located in the north-eastern part of Hungary through specific analyses and case descriptions.

Keywords: factors of social innovation, methodological combination, social innovation process, supporting decision-making

Procedia PDF Downloads 140
24413 Interactive Glare Visualization Model for an Architectural Space

Authors: Florina Dutt, Subhajit Das, Matthew Swartz

Abstract:

Lighting design and its impact on indoor comfort conditions are an integral part of good interior design. The impact of lighting in an interior space is manifold and involves many sub-components such as glare, color, tone, luminance, control, energy efficiency, and flexibility. While the other components have been researched and discussed many times, this paper discusses the research done to understand the glare component from an artificial lighting source in an indoor space. Consequently, the paper presents a parametric model that conveys real-time glare levels in an interior space to the designer/architect. Our end users are architects, and for them it is of the utmost importance to know what impression the proposed lighting arrangement and furniture layout will have on indoor comfort quality, especially for those furniture elements (or surfaces) that strongly reflect light around the space. Essentially, the designer needs to know the ramifications of discomfort glare at an early stage of the design cycle, when changes to the proposed design can still be made and different solution routes considered for the client. Unfortunately, most existing lighting analysis tools offer rigorous computation and analysis on the back end, which makes it challenging for the designer to analyze and understand glare from interior lighting quickly; moreover, many of them do not focus on the glare aspect of artificial light. That is why, in this paper, we explain a novel approach to approximating interior glare data, and we visualize these data in a color-coded format expressing the implications of the proposed interior design layout. We focus on making this analysis process computationally fast and fluid, enabling full user interaction with the capability to vary different ranges of user inputs, adding more degrees of freedom for the user. We test our proposed parametric model on a case study, a computer lab space in our college facility.
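
A standard discomfort-glare metric that such a fast parametric approximation can target is the CIE Unified Glare Rating, quoted here for reference (the paper's own approximation may differ):

```latex
% L_b: background luminance, L_i and \omega_i: luminance and solid angle of
% each glare source as seen by the observer, p_i: Guth position index.
\[
\mathrm{UGR} = 8 \log_{10} \left( \frac{0.25}{L_b}
  \sum_{i} \frac{L_i^{2}\, \omega_i}{p_i^{2}} \right) .
\]
```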

Keywords: computational geometry, glare impact in interior space, info visualization, parametric lighting analysis

Procedia PDF Downloads 338