Search results for: Construction process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6319

5419 Pattern Recognition Using Feature Based Die-Map Clustering in the Semiconductor Manufacturing Process

Authors: Seung Hwan Park, Cheng-Sool Park, Jun Seok Kim, Youngji Yoo, Daewoong An, Jun-Geol Baek

Abstract:

As big data analysis becomes increasingly important, yield prediction using data from the semiconductor process is essential. In general, yield prediction and analysis of the causes of failure are closely related. The purpose of this study is to analyze the patterns that affect the final test results using die-map-based clustering. Many studies have been conducted using die data from the semiconductor test process. However, such analyses are limited because the test data are only indirectly related to the final test results. Therefore, this study proposes a framework for analysis through clustering using more detailed data than the existing die data. The study consists of three phases. In the first phase, a die map is created from the fail-bit data in each sub-area of the die. In the second phase, clustering using the map data is performed. The third phase is to find the patterns that affect the final test result. Finally, the proposed three steps are applied to actual industrial data, and the experimental results show the potential for field application.
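
The three-phase pipeline described above maps naturally onto a small clustering script. The sketch below is only an illustration of that idea, assuming each die is summarised by fail-bit counts over a hypothetical 4x4 grid of sub-areas and grouped with k-means; the data, grid size and cluster count are not from the paper.

```python
# Minimal sketch of die-map clustering: each die is represented by a feature
# vector of fail-bit counts per sub-area, then grouped with k-means.
# The data and the number of sub-areas/clusters are hypothetical.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Toy data: 200 dies, each divided into a 4x4 grid of sub-areas;
# entry (i, j) is the fail-bit count of die i in sub-area j.
fail_bits = rng.poisson(lam=3.0, size=(200, 16)).astype(float)

# Phases 1-2: build die-map features and cluster them.
features = StandardScaler().fit_transform(fail_bits)
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(features)

# Phase 3: inspect each cluster's mean die map to look for spatial failure
# patterns (e.g. edge-concentrated fails) that may relate to the final test result.
for label in range(kmeans.n_clusters):
    mean_map = fail_bits[kmeans.labels_ == label].mean(axis=0).reshape(4, 4)
    print(f"cluster {label}: mean fail-bit map\n{mean_map.round(1)}")
```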

Keywords: Die-Map Clustering, Feature Extraction, Pattern Recognition, Semiconductor Manufacturing Process.

5418 European and International Bond Markets Integration

Authors: Dimitris Georgoutsos, Petros M. Migiakis

Abstract:

The current era is characterised by strengthened interactions among financial markets and increased capital mobility globally. In this frame, we examine the effects of the international financial integration process on the European bond markets. We perform a comparative study of the interactions of the European and international bond markets and exploit cointegration analysis results on the elimination of stochastic trends and the decomposition of the underlying long-run equilibria and short-run causal relations. Our investigation provides evidence on the relation between the European integration process and that of globalisation, viewed through the bond markets sector. Additionally, the structural formulation applied offers significant implications of the findings. All in all, our analysis offers a number of answers to crucial questions concerning the European bond market integration process.
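
Cointegration analysis of this kind tests whether two integrated series share a common stochastic trend. As a minimal illustration (on simulated series, not the authors' bond-yield data), the sketch below runs an Engle-Granger test with statsmodels; a panel- or system-based test would be needed to reproduce the paper's full setting.

```python
# Minimal illustration of a cointegration test on two simulated "yield"
# series sharing a common stochastic trend (not the authors' data).
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(1)
n = 500
common_trend = np.cumsum(rng.normal(size=n))        # shared I(1) component
yield_a = common_trend + rng.normal(scale=0.5, size=n)
yield_b = 0.8 * common_trend + rng.normal(scale=0.5, size=n)

t_stat, p_value, _ = coint(yield_a, yield_b)
print(f"Engle-Granger t-statistic: {t_stat:.2f}, p-value: {p_value:.3f}")
# A small p-value suggests the two series share a long-run equilibrium,
# i.e. the stochastic trends cancel in some linear combination.
```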

Keywords: financial integration, bond markets, cointegration

5417 The Effect of Solution Density on the Synthesis of Magnesium Borate from Boron-Gypsum

Authors: N. Tugrul, E. Sariburun, F. T. Senberber, A. S. Kipcak, E. Moroydor Derun, S. Piskin

Abstract:

Boron-gypsum is a waste that occurs in the boric acid production process. In this study, the boron content of this waste is evaluated for use in the synthesis of magnesium borates, since such evaluation of this kind of waste is more useful than storage or disposal. Magnesium borates, which are a sub-class of boron minerals, are useful additive materials for industry due to their remarkable thermal and mechanical properties. Magnesium borates were obtained hydrothermally at different temperatures. The novelty of this study is the investigation of the effect of solution density on the magnesium borate synthesis process, in order to increase the possibility of using boron-gypsum as a raw material. After the synthesis process, the products are subjected to XRD and FT-IR to identify and characterize their crystal structures, respectively.

Keywords: Boron-gypsum, hydrothermal synthesis, magnesium borate, solution density.

5416 E-Voting: A Trustworthiness In Democratic; A View from Technology, Political and Social Issue

Authors: Sera Syarmila Sameon, Rohaini Ramli

Abstract:

A trustworthy voting process in a democracy requires that each vote be recorded with accuracy and impartiality. Accuracy and impartiality can be achieved at a high rate with a biometric system, one example being the fingerprint. Fingerprint recognition is still a challenging problem because of the distortions among different impressions of the same finger. Given the trustworthiness of biometric voting technologies, they may have a great effect on voter participation and on the outcomes of the democratic process. Hence, in this study, the authors are interested in designing and analyzing an electronic voting system and the participation of its users. The system is based on fingerprint minutiae with the addition of a personal ID number, in order to enhance the accuracy and speed of the voting process. The new design is analyzed by conducting a pilot election among a class of students to select their representative.
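
The FAR and FRR mentioned in the keywords summarise the two error types of a score-based matcher. The sketch below is a generic illustration on simulated match scores, not the authors' minutiae matcher: it counts impostor scores accepted (FAR) and genuine scores rejected (FRR) at a few thresholds.

```python
# Minimal sketch of the FAR/FRR trade-off for a score-based fingerprint
# matcher at fixed decision thresholds; the match scores are simulated.
import numpy as np

rng = np.random.default_rng(2)
genuine_scores = rng.normal(loc=0.75, scale=0.10, size=1000)   # same finger
impostor_scores = rng.normal(loc=0.40, scale=0.12, size=1000)  # different finger

def far_frr(threshold):
    far = np.mean(impostor_scores >= threshold)  # impostors wrongly accepted
    frr = np.mean(genuine_scores < threshold)    # genuine users wrongly rejected
    return far, frr

for t in (0.5, 0.6, 0.7):
    far, frr = far_frr(t)
    print(f"threshold={t:.1f}  FAR={far:.3f}  FRR={frr:.3f}")
```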

Keywords: Biometric, FAR and FRR, democratic, voting

5415 Establishment and Evaluation of Information System for Chemotherapy Care

Authors: Yi-Ting Liu, Pei-Ying Wen

Abstract:

In order to improve the overall safety of chemotherapy, a safety-protecting net was established for the whole process, from prescribing by physicians and transcribing by nurses to dispensing by pharmacists and administering by nurses. The information system was used to check and monitor the whole administration process, and the related sheets were computerized to simplify the paperwork.

Keywords: Chemotherapy, Bar Code Medication Administration (BCMA), Medication Safety.

5414 Shear Behaviour of RC Deep Beams with Openings Strengthened with Carbon Fiber Reinforced Polymer

Authors: Mannal Tariq

Abstract:

The construction industry is progressing at a high pace, and the global trend is increasingly biased towards high-rise buildings. Deep beams, which have a small span-to-depth ratio, are among the most common elements in modern construction and are mostly used as transfer girders. This experimental study consists of 16 reinforced concrete (RC) deep beams. These beams were divided into two groups, A and B, of eight beams each, having depths of 381 mm (15 in) and 457 mm (18 in) respectively. Each group was further subdivided into four subgroups, each consisting of two identical beams: a solid/control beam (without opening) and beams with an opening above the neutral axis (NA), at the NA and below the NA. Except for the control beams, all beams with openings were strengthened with vertical carbon fibre reinforced polymer (CFRP) strips. These eight subgroups differ from each other in depth and location of openings. For testing, all beams were loaded with two symmetrical point loads. All beams were designed based on the strut-and-tie model concept. The outcome of the experimental investigation highlights the differences in the shear behaviour of deep beams as the depth and location of the circular openings vary. The 457 mm (18 in) deep beams with openings above the NA showed the highest strength, and the 381 mm (15 in) deep beams with openings below the NA showed the least strength. CFRP sheets played a vital role in increasing the shear capacity of the beams.

Keywords: CFRP, deep beams, openings in deep beams, strut and tie model, shear behaviour.

5413 Studies on Lucrative Process Layout for Medium Scale Industries

Authors: Balamurugan Baladhandapani, Ganesh Renganathan, V. R. Sanal Kumar

Abstract:

In this paper, a comprehensive review of various factory layouts has been carried out for designing a lucrative process layout for medium-scale industries. Industry databases reveal that the end-product rejection rate is on the order of 10%, amounting to a large profit loss. In order to avoid these rejection rates and to increase quality product production, an intermediate non-destructive testing facility (INDTF) has been recommended for increasing the overall profit. We observed through detailed case studies that introducing INDTF in medium-scale industries allows defective products to be removed from the expensive production process well before they reach their final shape. Additionally, the defective products identified during the intermediate stage can be effectively utilized for other applications or recycling, thereby reducing the overall wastage of raw materials and increasing profit. We concluded that the prudent design of a factory layout through the critical path method, facilitated by INDTF, will warrant a profitable outcome.
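
Since the abstract leans on the critical path method for layout design, here is a minimal CPM forward-pass sketch that includes a hypothetical intermediate non-destructive testing step; the activities and durations are illustrative only.

```python
# Minimal critical path method (CPM) sketch for a production flow that
# includes a hypothetical intermediate non-destructive testing (INDT) step.
activities = {          # activity: (duration, predecessors)
    "raw_material_prep": (2, []),
    "machining":         (5, ["raw_material_prep"]),
    "indt_inspection":   (1, ["machining"]),
    "assembly":          (4, ["indt_inspection"]),
    "final_test":        (2, ["assembly"]),
}

earliest_finish = {}

def finish_time(act):
    # Recursively compute the earliest finish time of an activity.
    if act not in earliest_finish:
        duration, preds = activities[act]
        earliest_finish[act] = duration + max((finish_time(p) for p in preds), default=0)
    return earliest_finish[act]

makespan = max(finish_time(a) for a in activities)
print("project makespan:", makespan)
# The critical path is any chain of activities whose durations sum to the makespan.
```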

Keywords: Intermediate Non-destructive testing, Medium scale industries, Process layout design.

5412 Hybrid Structure Learning Approach for Assessing the Phosphate Laundries Impact

Authors: Emna Benmohamed, Hela Ltifi, Mounir Ben Ayed

Abstract:

Bayesian Network (BN) is one of the most efficient classification methods. It is widely used in several fields (e.g., medical diagnostics, risk analysis, bioinformatics research). The BN is defined as a probabilistic graphical model that represents a formalism for reasoning under uncertainty. This classification method has a high performance rate in the extraction of new knowledge from data. The construction of this model consists of two phases: structure learning and parameter learning. For solving this problem, the K2 algorithm is one of the representative data-driven algorithms, being based on a score-and-search approach. In addition, the integration of the expert's knowledge in the structure learning process allows the highest accuracy to be obtained. In this paper, we propose a hybrid approach combining an improvement of the K2 algorithm, called the K2 algorithm for Parents and Children search (K2PC), and an expert-driven method for learning the structure of the BN. The evaluation of the experimental results, using well-known benchmarks, shows that our K2PC algorithm has better performance in terms of correct structure detection. The real application of our model shows its efficiency in the analysis of the impact of phosphate laundry effluents on the watershed in the Gafsa area (southwestern Tunisia).
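
Score-and-search structure learning of the K2 family ranks candidate parent sets with the Cooper-Herskovits (K2) score. The sketch below computes that score on toy binary data; the variables and data are hypothetical, and the search loop, K2PC refinements and expert constraints are omitted.

```python
# Minimal sketch of the K2 (Cooper-Herskovits) score used in score-and-search
# structure learning; toy data, not the phosphate-laundry dataset.
import numpy as np
from math import lgamma
from collections import defaultdict

def k2_log_score(data, child, parents, cardinality):
    """Log K2 score of `child` given a candidate `parents` list.
    data: 2-D integer array (samples x variables), values in 0..card-1."""
    r = cardinality[child]
    counts = defaultdict(lambda: np.zeros(r, dtype=int))
    for row in data:
        parent_config = tuple(row[p] for p in parents)
        counts[parent_config][row[child]] += 1
    score = 0.0
    for n_jk in counts.values():
        n_j = int(n_jk.sum())
        score += lgamma(r) - lgamma(n_j + r)               # (r-1)! / (N_j + r - 1)!
        score += sum(lgamma(int(n) + 1) for n in n_jk)     # product of N_jk!
    return score

rng = np.random.default_rng(3)
x = rng.integers(0, 2, size=500)
noise = rng.random(500) < 0.1
y = np.where(noise, 1 - x, x)          # y copies x 90% of the time
data = np.column_stack([x, y])
card = [2, 2]
print("log K2 score, y with parent {x}:", k2_log_score(data, child=1, parents=[0], cardinality=card))
print("log K2 score, y with no parents:", k2_log_score(data, child=1, parents=[], cardinality=card))
```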

Keywords: Classification, Bayesian network, structure learning, K2 algorithm, expert knowledge, surface water analysis.

5411 Asymmetric Tukey’s Control Chart Robust to Skew and Non-Skew Process Observation

Authors: S. Sukparungsee

Abstract:

In reality, process observations often depart from the assumption that they are normally distributed. The observations may follow skewed distributions, for which an asymmetric chart should be used rather than a symmetric one. Consequently, this research aims to study the robustness of the asymmetric Tukey's control chart for skewed and non-skewed distributions such as the Lognormal and Laplace distributions. Furthermore, the performance of the asymmetric and symmetric Tukey's control charts in detecting a change in a parameter is compared using the Average ARL (AARL). The results show that the asymmetric Tukey's control chart performs better than the symmetric one for both skewed and non-skewed process observations.
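
Tukey-type control limits are built from quartiles, e.g. LCL = Q1 - kL*IQR and UCL = Q3 + kU*IQR, with unequal multipliers in the asymmetric case. The sketch below uses illustrative multipliers (not the paper's tuned values) on a lognormal process and estimates run lengths by simulation.

```python
# Minimal sketch of asymmetric Tukey-type control limits built from quartiles
# of in-control data, with a simulated run-length estimate.
import numpy as np

rng = np.random.default_rng(4)
in_control = rng.lognormal(mean=0.0, sigma=0.5, size=10_000)   # skewed process

q1, q3 = np.percentile(in_control, [25, 75])
iqr = q3 - q1
k_low, k_up = 1.2, 2.0                       # asymmetric multipliers (illustrative)
lcl, ucl = q1 - k_low * iqr, q3 + k_up * iqr

def average_run_length(shift=0.0, n_runs=500, max_len=50_000):
    # Average number of observations until a point falls outside (lcl, ucl).
    lengths = []
    for _ in range(n_runs):
        for t in range(1, max_len + 1):
            x = rng.lognormal(mean=shift, sigma=0.5)
            if x < lcl or x > ucl:
                lengths.append(t)
                break
    return np.mean(lengths)

print("in-control ARL0:", round(average_run_length(0.0), 1))
print("shifted    ARL1:", round(average_run_length(0.3), 1))
```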

Keywords: Asymmetric control limit, average of average run length, Tukey’s control chart and skew distributions.

5410 Nugget Formation during Resistance Spot Welding using Finite Element Model

Authors: Jawad Saleem, Abdul Majid, Kent Bertilsson, Torbjörn Carlberg, Nazar Ul Islam

Abstract:

The resistance spot welding process comprises electrical, thermal and mechanical phenomena, which make the process complex and highly non-linear and thus difficult to model. In order to obtain a good weld nugget during spot welding, trial-and-error welds are usually made, which is very costly. Therefore, numerical simulation research has been conducted to understand the whole process. In this paper, three different cases were analyzed by varying the tip contact area, and it was observed that varying the tip contact area affects the nugget formation at the faying surface. The tip contact area of the welding electrode becomes large with long welding cycles. Therefore, in order to maintain consistency of nugget formation during the welding process, current compensation in the control feedback is required. If the contact area of the welding electrode tip is reduced, a large amount of current flows through the faying surface, as a result of which sputtering occurs.
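
The effect of tip contact area can be seen in a back-of-the-envelope Joule-heating estimate, Q = I^2*R*t, if the contact resistance is assumed to scale inversely with the contact area. The numbers below are assumed for illustration and are not the paper's finite-element parameters.

```python
# Rough illustration of why tip contact area matters: for a fixed welding
# current, current density rises and (with contact resistance assumed
# inversely proportional to area) so does the Joule heat at the interface.
weld_current = 8_000.0          # A, assumed welding current
weld_time = 0.2                 # s, assumed weld time
resistivity_area = 2.5e-3       # ohm*mm^2, assumed (contact resistance x area)

for tip_diameter_mm in (5.0, 6.0, 8.0):
    area_mm2 = 3.14159 * (tip_diameter_mm / 2.0) ** 2
    contact_resistance = resistivity_area / area_mm2          # ohm, shrinks as area grows
    current_density = weld_current / area_mm2                 # A/mm^2
    joule_heat = weld_current ** 2 * contact_resistance * weld_time   # J, Q = I^2*R*t
    print(f"tip diameter {tip_diameter_mm} mm: "
          f"current density {current_density:.0f} A/mm^2, heat {joule_heat:.0f} J")
```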

Keywords: Resistance spot welding, Finite element modeling, Nugget formation, Welding electrode, Numerical method simulation.

5409 Improvement of GVPI Insulation System Characteristics by Curing Process Modification

Authors: M. Shadmand

Abstract:

The curing process of the insulation system for electrical machines plays a decisive role in its durability and reliability. The polar structure of the insulating resin molecules and of the filler used in the insulation system can be leveraged to enhance the overall mechanical and electrical characteristics of the system. The curing regime of the insulating system plays an important role in its mechanical and electrical characteristics by arranging the polymerization of the resin chain structure. In this research, the effect of applying an electric field during curing of a Global Vacuum Pressurized Impregnation (GVPI) insulation system for a traction motor was considered by performing dissipation factor, polarization and depolarization current (PDC) and voltage endurance (aging) measurements on sample test objects. The results showed an obvious improvement in the mechanical strength of the insulation system as well as better electrical characteristics in routine and long-term (aging) electrical tests. Taken together, polarization of the insulation system during the curing process would enhance the machine lifetime.

Keywords: Insulation system, GVPI, PDC, aging.

5408 Optimization of Control Parameters for EWR in Injection Flushing Type of EDM on Stainless Steel 304 Workpiece

Authors: M. S. Reza, M. Hamdi, S. H. Tomadi, A. R. Ismail

Abstract:

The operating control parameters of the injection flushing type of electrical discharge machining process on a stainless steel 304 workpiece using copper tools are optimized with respect to an individual machining characteristic, the Electrode Wear Ratio (EWR). A higher EWR gives poor dimensional precision for the EDM-machined workpiece because of high electrode wear. Hence, the quality characteristic for EWR is set to lower-the-better to achieve the optimum dimensional precision for the machined workpiece. The Taguchi method has been used for the construction, layout and analysis of the experiment for the EWR machining characteristic. The use of the Taguchi method in the experiment saves a lot of time and cost in preparing and machining the experiment samples. Therefore, an L18 orthogonal array, a fundamental component of the statistical design of experiments, has been used to plan the experiments, and Analysis of Variance (ANOVA) is used to determine the optimum machining parameters for this machining characteristic. The control parameters selected for these optimization experiments are polarity, pulse-on duration, discharge current, discharge voltage, machining depth, machining diameter and dielectric liquid pressure. The results show that a negative-polarity machining parameter setting decreases the EWR.
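
In a Taguchi lower-the-better analysis, each run's responses are converted to a signal-to-noise ratio, S/N = -10*log10(mean(y^2)), and factor levels are ranked by their mean S/N. The sketch below illustrates this for a two-level polarity factor with made-up EWR values, not the paper's L18 data.

```python
# Minimal sketch of the Taguchi "lower-the-better" signal-to-noise ratio used
# to rank parameter settings by EWR; the values below are illustrative.
import numpy as np

# EWR measurements (%) for six hypothetical runs (two replicates each) and the
# polarity level (0 = positive, 1 = negative) used in each run.
ewr = np.array([[4.1, 4.3], [3.8, 4.0], [2.1, 2.4], [2.0, 1.9], [4.5, 4.2], [2.2, 2.3]])
polarity = np.array([0, 0, 1, 1, 0, 1])

# Lower-the-better S/N ratio: -10 * log10(mean(y^2)) per run.
sn = -10.0 * np.log10((ewr ** 2).mean(axis=1))

# Main effect of polarity: average S/N at each level (higher S/N is better).
for level in (0, 1):
    print(f"polarity level {level}: mean S/N = {sn[polarity == level].mean():.2f} dB")
```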

Keywords: ANOVA, EDM, Injection Flushing, L18 Orthogonal Array, EWR, Stainless Steel 304

5407 The Forensic Swing of Things: The Current Legal and Technical Challenges of IoT Forensics

Authors: Pantaleon Lutta, Mohamed Sedky, Mohamed Hassan

Abstract:

The inability of organizations to put in place management control measures for Internet of Things (IoT) complexities continues to be a risk concern. Policy makers have been left scrambling to find measures to combat these security and privacy concerns. IoT forensics is a cumbersome process, as there is no standardization of IoT products and little or no historical data are stored on the devices. This paper highlights why IoT forensics is a unique adventure and brings out the legal challenges encountered in the investigation process. A quadrant model is presented to study the conflicting aspects of IoT forensics. The model analyses the effectiveness of the forensic investigation process against the admissibility and integrity of the evidence, taking into account user privacy and the providers' compliance with laws and regulations. Our analysis concludes that a semi-automated forensic process using machine learning could eliminate the human factor from the profiling and surveillance processes, and hence resolve the issues of data protection (privacy and confidentiality).

Keywords: Cloud forensics, data protection laws, GDPR, IoT forensics, machine learning.

5406 The Concept of an Agile Enterprise Research Model

Authors: Maja Sajdak

Abstract:

The aim of this paper is to present the concept of an agile enterprise model and to initiate discussion on the research assumptions of the model presented. The implementation of the research project "The agility of enterprises in the process of adapting to the environment and its changes" began in August 2014 and is planned to last three years. The article has the form of a work-in-progress paper which aims to verify and initiate a debate over the proposed research model. In the literature there are very few publications relating to research into agility; it can be concluded that the most controversial issue in this regard is the method of measuring agility. In previous studies the operationalization of agility was often fragmentary, focusing only on selected areas of agility, for example manufacturing, or analysing only selected sectors. As a result the measures created to date can only be treated as contributory to the development of precise measurement tools. This research project aims to fill a cognitive gap in the literature with regard to the conceptualization and operationalization of an agile company. Thus, the original contribution of the author of this project is the construction of a theoretical model that integrates manufacturing agility (consisting mainly in adaptation to the environment) and strategic agility (based on proactive measures). The author of this research project is primarily interested in the attributes of an agile enterprise which indicate that the company is able to rapidly adapt to changing circumstances and behave pro-actively.

Keywords: Agile company, acuity, entrepreneurship, flexibility, research model, strategic leadership.

5405 The Study of the Discrete Risk Model with Random Income

Authors: Peichen Zhao

Abstract:

In this paper, we extend the compound binomial model to the case where the premium income process, based on a binomial process, is no longer a linear function. First, a recursive formula is derived for the non-ruin probability. Second, we show that the expected discounted penalty function satisfies a defective renewal equation. Third, an asymptotic estimate for the expected discounted penalty function is given. Finally, we give two examples of ruin quantities to illustrate applications of the recursive formula and the asymptotic estimate of the penalty function.
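
To make the setting concrete, the sketch below estimates finite-horizon ruin probabilities by Monte Carlo in a compound binomial surplus process where the premium income itself is random (arriving with probability q each period); the parameters and claim distribution are illustrative, and this simulation stands in for, rather than reproduces, the paper's recursive formula.

```python
# Monte Carlo sketch of finite-horizon ruin in a compound binomial model with
# random premium income; all parameters are illustrative.
import numpy as np

rng = np.random.default_rng(5)

def ruin_probability(u0, horizon=200, q=0.9, p=0.3, claim_mean=2.0, n_paths=10_000):
    ruined = 0
    for _ in range(n_paths):
        u = u0
        for _ in range(horizon):
            if rng.random() < q:                      # random premium income of 1
                u += 1
            if rng.random() < p:                      # a claim occurs this period
                u -= rng.geometric(1.0 / claim_mean)  # claim size >= 1, mean = claim_mean
            if u < 0:                                 # ruin: surplus drops below zero
                ruined += 1
                break
    return ruined / n_paths

for u0 in (0, 5, 10):
    print(f"initial surplus {u0:2d}: finite-horizon ruin probability ~ {ruin_probability(u0):.3f}")
```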

Keywords: Discounted penalty function, compound binomial process, recursive formula, discrete renewal equation, asymptotic estimate.

5404 A Panel Cointegration Analysis for Macroeconomic Determinants of International Housing Market

Authors: Mei-Se Chien, Chien-Chiang Lee, Sin-Jie Cai

Abstract:

The main purpose of this paper is to investigate the long-run equilibrium and short-run dynamics of international housing prices when macroeconomic variables change. We apply Pedroni's panel cointegration, using unbalanced panel data for 33 countries over the period from 1980Q1 to 2013Q1, to examine the relationships among house prices and macroeconomic variables. Our empirical results of the panel cointegration tests support the existence of a cointegrating relationship among these macroeconomic variables and house prices. Besides, the empirical results of panel DOLS further show that a 1% increase in economic activity, long-term interest rates, and construction costs causes house prices to change by 2.16%, -0.04%, and 0.22% respectively in the long run. Furthermore, increasing economic activity and construction costs have stronger impacts on house prices in lower-income countries than in higher-income countries. The results lead to the conclusion that a policy of house price growth can be regarded as economic growth policy for lower-income countries. Finally, in the America region, the coefficient of economic activity is the highest, which shows that increasing economic activity causes a faster rise in house prices there than in other regions. There are some special cases whereby the coefficients of interest rates are significantly positive in the America and Asia regions.

Keywords: House prices, Macroeconomic Variables, Panel cointegration, Dynamic OLS.

5403 XML Data Management in Compressed Relational Database

Authors: Hongzhi Wang, Jianzhong Li, Hong Gao

Abstract:

XML is an important standard for data exchange and representation. Since relational databases are mature systems, using a relational database to support XML data may bring some advantages. However, storing XML in a relational database introduces obvious redundancy that wastes disk space, bandwidth and disk I/O when querying XML data. For efficient storage and querying of XML, it is necessary to use compressed XML data in the relational database. In this paper, a compressed relational database technology supporting XML data is presented. The original relational storage structure is well suited to XPath query processing, and the compression method preserves this feature. Besides traditional relational database techniques, additional query processing technologies on compressed relations and on special structures for XML are presented. Technologies for XQuery processing in a compressed relational database are also presented.
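
A common way to map XML onto relations, before any compression layer, is a node/edge table in which each element row points to its parent; an XPath step then becomes a self-join. The sketch below shows this shredding and a path query in SQLite; the schema is a generic illustration, not the storage structure proposed in the paper.

```python
# Minimal sketch of storing XML in a relational node/edge table and answering
# a simple path query in SQL; compression would sit on top of this layout.
import sqlite3
import xml.etree.ElementTree as ET
from itertools import count

xml_doc = "<lib><book><title>XML</title></book><book><title>SQL</title></book></lib>"

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE node (id INTEGER PRIMARY KEY, parent INTEGER, tag TEXT, text TEXT)")
ids = count(1)

def shred(elem, parent=None):
    # Store each element as a row pointing at its parent.
    node_id = next(ids)
    con.execute("INSERT INTO node VALUES (?, ?, ?, ?)",
                (node_id, parent, elem.tag, (elem.text or "").strip()))
    for child in elem:
        shred(child, node_id)

shred(ET.fromstring(xml_doc))

# Relational equivalent of the XPath /lib/book/title: join titles to their parent books.
rows = con.execute("""
    SELECT t.text FROM node AS t
    JOIN node AS b ON t.parent = b.id
    WHERE t.tag = 'title' AND b.tag = 'book'
""").fetchall()
print([r[0] for r in rows])   # ['XML', 'SQL']
```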

Keywords: XML, compression, query processing

5402 Solving Weighted Number of Operation Plus Processing Time Due-Date Assignment, Weighted Scheduling and Process Planning Integration Problem Using Genetic and Simulated Annealing Search Methods

Authors: Halil Ibrahim Demir, Caner Erden, Mumtaz Ipek, Ozer Uygun

Abstract:

Traditionally, the three important manufacturing functions of process planning, scheduling and due-date assignment are performed separately and sequentially. For a couple of decades, hundreds of studies have been conducted on integrated process planning and scheduling problems, and numerous studies have been performed on scheduling with due-date assignment, but unfortunately the integration of these three important functions has not been adequately addressed. Here, the integration of these three important functions is studied using genetic, random-genetic hybrid, simulated annealing, random-simulated annealing hybrid and random search techniques. In addition, the importance of integrating these three functions and the power of metaheuristics and hybrid heuristics are studied.
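
As a flavour of the metaheuristics involved, the sketch below applies simulated annealing with a swap neighbourhood to a weighted single-machine sequencing sub-problem (minimising the weighted sum of completion times); the jobs, weights and cooling schedule are illustrative, and the integrated process-planning and due-date decisions of the paper are not modelled.

```python
# Minimal simulated annealing sketch for weighted single-machine sequencing.
import math
import random

random.seed(6)
processing = [4, 2, 7, 3, 5, 1]        # processing time per job (illustrative)
weight =     [1, 5, 2, 4, 3, 6]        # weight (importance) per job

def cost(seq):
    t, total = 0, 0
    for j in seq:
        t += processing[j]
        total += weight[j] * t          # weighted completion time
    return total

current = list(range(len(processing)))
best = current[:]
temperature = 50.0
while temperature > 0.1:
    i, j = random.sample(range(len(current)), 2)
    neighbour = current[:]
    neighbour[i], neighbour[j] = neighbour[j], neighbour[i]   # swap move
    delta = cost(neighbour) - cost(current)
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        current = neighbour                                   # accept move
        if cost(current) < cost(best):
            best = current[:]
    temperature *= 0.99                 # geometric cooling

print("best sequence:", best, "weighted completion time:", cost(best))
```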

Keywords: Process planning, weighted scheduling, weighted due-date assignment, genetic search, simulated annealing, hybrid meta-heuristics.

5401 Design of Seismically Resistant Tree-Branching Steel Frames Using Theory and Design Guides for Eccentrically Braced Frames

Authors: R. Gary Black, Abolhassan Astaneh-Asl

Abstract:

The International Building Code (IBC) and the California Building Code (CBC) both recognize four basic types of steel seismic resistant frames: moment frames, concentrically braced frames, shear walls and eccentrically braced frames. Based on specified geometries and detailing, the seismic performance of these steel frames is well understood. In 2011, the authors designed an innovative steel braced frame system with tapering members in the general shape of a branching tree as a seismic retrofit solution to an existing four-story "lift-slab" building. Located in the seismically active San Francisco Bay Area of California, a frame of this configuration, not covered by the governing codes, would typically require model or full-scale testing to obtain jurisdiction approval. This paper describes how the theories, protocols, and code requirements of eccentrically braced frames (EBFs) were employed to satisfy the 2009 International Building Code (IBC) and the 2010 California Building Code (CBC) for seismically resistant steel frames and permit construction of these nonconforming geometries.

Keywords: Eccentrically Braced Frame, Lift Slab Construction, Seismic Retrofit, Shear Link, Steel Design.

5400 Sustainable Development of Medium Strength Concrete Using Polypropylene as Aggregate Replacement

Authors: Reza Keihani, Ali Bahadori-Jahromi, Timothy James Clacy

Abstract:

Plastic as an environmental burden is a well-rehearsed topic in the research area. This is due to its global demand and destructive impacts on the environment, which have been a significant concern to governments. Typically, the use of plastic in the construction industry is seen across low-density, non-structural applications due to its diverse range of benefits, including high strength-to-weight ratios, manipulability and durability. It can be said that, with the level of plastic consumption experienced in the construction industry, this sector has an ongoing responsibility to continually innovate alternatives for the application of recycled plastic waste, such as using plastic-made replacements from polyethylene, polystyrene, polyvinyl and polypropylene in the concrete mix design. In this study, the impact of partially replacing fine aggregate with polypropylene in the concrete mix design was investigated to evaluate the concrete's compressive strength, by conducting experimental work comprising six concrete mix batches with polypropylene replacements ranging from 0.5 to 3.0%. The results demonstrated a typical decline in compressive strength with the addition of plastic aggregate, although this reduction was generally mitigated as the level of plastic in the concrete mix increased. Furthermore, two of the six plastic-containing concrete mixes tested in the current study, containing 1.50% and 2.50% plastic aggregates, exceeded the ST5 standardised prescribed concrete mix compressive strength requirement at 28 days, which demonstrates the potential for the use of recycled polypropylene in structural applications as a partial, by mass, fine aggregate replacement in the concrete mix.

Keywords: Compressive strength, concrete, polypropylene, sustainability.

5399 Considering the Effect of Semi-Rigid Connection in Steel Frame Structures for Progressive Collapse

Authors: Fooad Karimi Ghaleh Jough, Mohsen Soori

Abstract:

Today, the occurrence of progressive failure in structures has become a challenging issue, requiring suitable solutions for structural resistance to this phenomenon. It is also necessary to evaluate the vulnerability of existing and under-construction buildings to progressive failure. The kind of lateral load-resisting system the building and its connections have is one of the most significant and influential variables in structural resistance to the risk of progressive failure. Using the "Alternative Path" approach suggested by the GSA2003 and UFC2013 recommendations, different configurations of semi-rigid connections against progressive failure are offered in this study. To do this, the OpenSees program was used to model nine distinct semi-rigid connection configurations on a three-story SAC structure, accounting for the impact of connection stiffness. Then, using nonlinear dynamic analysis, the effects of column removal were explored in two scenarios: corner column removal and middle column removal on the first level. Nonlinear static analysis results showed that when a column is removed, structures with semi-rigid connections experience larger displacements, which result in the formation of plastic hinges. Furthermore, the findings of the nonlinear static analysis made clear that the possibility of progressive failure increased with the number of semi-rigid connections in the structure.

Keywords: Semi-rigid, nonlinear static analysis, progressive collapse, alternative path.

5398 QR Technology to Automate Health Condition Detection Payment System: A Case Study in Schools of the Kingdom of Saudi Arabia

Authors: Amjad Alsulami, Farah Albishri, Kholod Alzubidi, Lama Almehemadi, Salma Elhag

Abstract:

Food allergy is a common and rising problem among children. Many students have their first allergic reaction at school; one such reaction is anaphylaxis, which can be fatal. This study found that several schools' processes lacked safety regulations and information on how to handle allergy issues and chronic diseases like diabetes, and that students were not supervised or monitored during the cafeteria purchasing process. Academic institutions make no obvious prevention effort regarding the purchase of food containing allergens or food that negatively impacts the health status of students who suffer from chronic diseases. The stability of students' health must be maintained because it greatly affects their performance and educational achievement. To address this issue, this paper uses business process reengineering to propose the automation of the whole food-purchasing process, which will aid in detecting and avoiding allergic occurrences and preventing any side effects from eating foods that conflict with students' health. This may be achieved by designing a smart card with an embedded QR code that reveals which foods cause an allergic reaction in a student. A survey was distributed to determine and examine how the cafeteria handles allergic children and whether any management or policy is applied in the school. The survey findings indicate that the integration of QR technology into the food-purchasing process would improve health condition detection. Families supported the suggested solution as advantageous because it ensures their children avoid eating food that is not allowed. Moreover, by analyzing and simulating the as-is process and the suggested process, the results demonstrate an improvement in quality and time.
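
The smart-card idea boils down to encoding a health profile that the point of sale can read and check against item ingredients. The sketch below is hypothetical: the field names, student data and blocking logic are made up, and it uses the third-party qrcode package only to show that such a profile fits in a QR image.

```python
# Hypothetical sketch: encode a student's health profile in a QR code and
# block cafeteria items whose ingredients conflict with it.
import json
import qrcode

student_profile = {
    "student_id": "S-1024",
    "allergies": ["peanut", "egg"],
    "conditions": ["diabetes"],
}

# Card issuance: serialise the profile and embed it in a QR image.
qrcode.make(json.dumps(student_profile)).save("student_S-1024.png")

# Point of sale: after the QR is scanned back into a profile dict, block items
# whose ingredient lists intersect the student's allergy list.
def blocked_items(scanned_profile, item_ingredients):
    allergies = set(scanned_profile["allergies"])
    return [name for name, ingredients in item_ingredients.items()
            if allergies & set(ingredients)]

menu = {"peanut cookie": ["peanut", "flour"], "apple": ["apple"]}
print(blocked_items(student_profile, menu))   # ['peanut cookie']
```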

Keywords: QR code, smart card, food allergies, Business Process reengineering, health condition detection.

5397 Treatment of Oily Wastewater by Fibrous Coalescer Process: Stage Coalescer and Model Prediction

Authors: Pisut Painmanakul, Kotchakorn Kongkangwarn, Nattawin Chawaloesphonsiya

Abstract:

The coalescer process is one method of oily water treatment: it increases the oil droplet size in order to enhance the separation velocity and thus achieve effective separation. However, the presence of surfactants in an oily emulsion can limit these mechanisms owing to the small oil droplet size associated with a stabilized emulsion. In this regard, the purpose of this research is to improve the efficiency of the coalescer process for treating stabilized emulsions. The effects of bed type, bed height, liquid flow rate and stage coalescer (step-bed) on the treatment efficiencies in terms of COD values were studied. Note that the treatment efficiency obtained experimentally was estimated by using the COD values and the oil droplet size distribution. The study has shown that the plastic media attach oil particles more effectively than the stainless ones, owing to their hydrophobic properties. Furthermore, a suitable bed height (3.5 cm) and step bed (3.5 cm with 2 steps) were necessary to obtain good coalescer performance. The application of the step-bed coalescer process in the reactor provided higher treatment efficiencies in terms of COD removal than the classical process. The proposed model for predicting the area under the curve, and thus the treatment efficiency, based on the single collector efficiency (ηT) and the attachment efficiency (α), provides relatively good agreement between the experimental and predicted treatment efficiencies in this study.

Keywords: Stage coalescer, stabilized emulsions, treatment efficiency, model prediction.

5396 RDFGraph: New Data Modeling Tool for Semantic Web

Authors: Daniel Siahaan, Aditya Prapanca

Abstract:

The emerging Semantic Web has attracted many researchers and developers. New applications have been developed on top of the Semantic Web, and many supporting tools have been introduced to improve its software development process. Metadata modeling is one part of the development process where supporting tools exist. The existing tools lack readability and make it difficult for a domain knowledge expert to graphically model a problem as a semantic model. In this paper, a metadata modeling tool called RDFGraph is proposed. This tool is meant to solve those problems. RDFGraph is also designed to work with modern database management systems that support RDF and to improve the performance of the query execution process. The testing results show that the rules used in RDFGraph follow the W3C standard and that the graphical model produced by the tool is properly translated and correct.
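
For readers unfamiliar with the underlying data model, the sketch below builds a tiny RDF graph and runs a SPARQL query over it using the third-party rdflib package; the namespace and triples are illustrative and are not output of RDFGraph itself.

```python
# Minimal sketch of RDF metadata and a SPARQL query with rdflib.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

EX = Namespace("http://example.org/")
g = Graph()
g.add((EX.alice, RDF.type, EX.Researcher))
g.add((EX.alice, EX.authorOf, EX.paper42))
g.add((EX.paper42, EX.title, Literal("RDFGraph: New Data Modeling Tool")))

# SPARQL query over the in-memory graph: titles of papers with an author.
results = g.query("""
    PREFIX ex: <http://example.org/>
    SELECT ?title WHERE { ?who ex:authorOf ?p . ?p ex:title ?title . }
""")
for (title,) in results:
    print(title)
```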

Keywords: CASE tool, data modeling, semantic web

5395 A Refined Energy-Based Model for Friction-Stir Welding

Authors: Samir A. Emam, Ali El Domiaty

Abstract:

Friction-stir welding has received huge interest in the last few years. The many advantages of this promising process have led researchers to present different theoretical and experimental explanations of the process. The way to quantitatively and qualitatively control the different parameters of the friction-stir welding process has not yet been paved. In this study, a refined energy-based model that estimates the energy generated due to friction and plastic deformation is presented. The effect of the plastic deformation at low energy levels is significant, and hence a scale factor is introduced to control its effect. The heat energy and maximum temperature predicted by our model are compared with the theoretical and experimental results available in the literature, and good agreement is obtained. The model is applied to the AA6000 and AA7000 series aluminum alloys.
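
For orientation, the frictional part of the heat input at a flat tool shoulder is often estimated, under a pure-sliding assumption, as Q = (2/3)*pi*mu*p*omega*R^3. The numbers in the sketch below are assumed, and the plastic-deformation term and the scale factor introduced in the paper are not included.

```python
# Back-of-the-envelope frictional heat generation at a flat FSW tool shoulder
# under a pure-sliding assumption; all parameter values are assumed.
import math

mu = 0.4                      # assumed friction coefficient
pressure = 50e6               # Pa, assumed axial pressure under the shoulder
rpm = 1000.0                  # assumed tool rotation speed
shoulder_radius = 0.009       # m, assumed shoulder radius

omega = 2.0 * math.pi * rpm / 60.0                     # rad/s
q_friction = (2.0 / 3.0) * math.pi * mu * pressure * omega * shoulder_radius ** 3
print(f"frictional heat generation ~ {q_friction / 1000:.1f} kW")
```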

Keywords: Friction-stir welding, Energy, Aluminum Alloys.

5394 The Adoption of Process Management for Accounting Information Systems in Thailand

Authors: Manirath Wongsim, Pawornprat Hongsakon

Abstract:

Information Quality (IQ) has become a critical, strategic issue in Accounting Information Systems (AIS) adoption. In order to implement AIS adoption successfully, it is important to consider the quality of information used throughout the adoption process, which seriously impacts the effectiveness of AIS adoption practice and the optimisation of AIS adoption decisions. There is a growing need for research to provide insights into issues and solutions related to IQ in AIS adoption. The need for an integrated approach to improve IQ in AIS adoption, as well as the unique characteristics of accounting data, demands an AIS-adoption-specific IQ framework. This research aims to explore ways of managing information quality and AIS adoption and to investigate the relationship between IQ issues and the AIS adoption process. This study has led to the development of a framework for understanding IQ management in AIS adoption. The research was conducted with 44 respondents from ten manufacturing organisations in Thailand. The empirical findings suggest that IQ dimensions in AIS adoption provide assistance in all processes of decision making. This research provides empirical evidence that the information quality of AIS adoption affects decision making and suggests that these variables should be considered when adopting AIS in order to improve its effectiveness.

Keywords: Information quality, information quality dimensions, accounting information systems, accounting Information system adoption.

5393 The Coupling of Photocatalytic Oxidation Processes with Activated Carbon Technologies and the Comparison of the Treatment Methods for Organic Removal from Surface Water

Authors: N. Areerachakul

Abstract:

The surface water used in this study was collected from the lower part of the Chao Praya River at the Nonthaburi bridge and used throughout the experiment. TOC (also known as DOC) in the range of 2.5 to 5.6 mg/L was investigated in this experiment. The use of conventional treatment methods such as FeCl3 and PAC showed that TOC removal was 65% using FeCl3 and 78% using PAC (powdered activated carbon). The advanced oxidation process alone showed only 35% removal of TOC. Coupling advanced oxidation with a small amount of PAC (0.05 g/L) increased efficiency by up to 55%. Combining BAC with the advanced oxidation process and a small amount of PAC demonstrated the highest efficiency, up to 95% TOC removal, with lower sludge production compared with the other methods.

Keywords: Advanced oxidation process, TOC, PAC

5392 Customer Knowledge and Service Development, the Web 2.0 Role in Co-production

Authors: Roberto Boselli, Mirko Cesarini, Mario Mezzanzanica

Abstract:

The paper is concerned with relationships between SSME and ICTs and focuses on the role of Web 2.0 tools in the service development process. The research presented aims at exploring how collaborative technologies can support and improve service processes, highlighting customer centrality and value co-production. The core idea of the paper is the centrality of user participation and of collaborative technologies as enabling factors; Wikipedia is analyzed as an example. The result of such analysis is the identification and description of a pattern characterising specific services in which users collaborate, by means of web tools, with value co-producers during the service process. The pattern of collaborative co-production concerning several categories of services, including knowledge-based services, is then discussed.

Keywords: Service Interaction Patterns, Services Science, Web 2.0 tools, Service Development Process.

5391 Stability Bound of Ruin Probability in a Reduced Two-Dimensional Risk Model

Authors: Zina Benouaret, Djamil Aissani

Abstract:

In this work, we introduce the qualitative and quantitative concept of the strong stability method in a risk process modeling two lines of business of the same insurance company, or an insurance company and a reinsurance company that divide both claims and premiums between them in a certain proportion. The proposed approach is based on the identification of the ruin probability associated with the model considered with the stationary distribution of a Markov random process called the reversed process. Our objective, after clarifying the conditions and the perturbation domain of the parameters, is to obtain a stability inequality for the ruin probability, which is applied to estimate the error made when a model with perturbed parameters is approximated by the considered model. In the stability bound obtained, all constants are written explicitly.

Keywords: Markov chain, risk models, ruin probabilities, strong stability analysis.

5390 A Stochastic Diffusion Process Based on the Two-Parameters Weibull Density Function

Authors: Meriem Bahij, Ahmed Nafidi, Boujemâa Achchab, Sílvio M. A. Gama, José A. O. Matos

Abstract:

Stochastic modeling concerns the use of probability to model real-world situations in which uncertainty is present. Therefore, the purpose of stochastic modeling is to estimate the probability of outcomes within a forecast, i.e. to be able to predict what conditions or decisions might occur under different situations. In the present study, we present a model of a stochastic diffusion process based on the bi-Weibull distribution function (its trend is proportional to the bi-Weibull probability density function). In general, the Weibull distribution has the ability to assume the characteristics of many different types of distributions. This has made it very popular among engineers and quality practitioners, who have considered it the most commonly used distribution for studying problems such as modeling reliability data, accelerated life testing, and maintainability modeling and analysis. In this work, we start by obtaining the probabilistic characteristics of this model, such as the explicit expression of the process, its trends, and its distribution, by transforming the diffusion process into a Wiener process as shown in Ricciardi's theorem. Then, we develop the statistical inference of this model using the maximum likelihood methodology. Finally, we analyse, with simulated data, the computational problems associated with the parameters, an issue of great importance for its application to real data, using convergence analysis methods. Overall, the use of a stochastic model reflects only a pragmatic decision on the part of the modeler. According to the data that are available and the universe of models known to the modeler, this model represents the best currently available description of the phenomenon under consideration.
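
As a concrete, simplified picture of such a process, the sketch below simulates, with an Euler-Maruyama scheme, a diffusion whose drift is proportional to a two-parameter Weibull density; the drift form, parameters and noise term are illustrative, and the paper's exact bi-Weibull specification and likelihood estimation are not reproduced.

```python
# Euler-Maruyama sketch of a diffusion whose drift follows a two-parameter
# Weibull density: dX_t = X_t * f(t; k, lam) dt + sigma * X_t dW_t.
import numpy as np

def weibull_pdf(t, k, lam):
    # Two-parameter Weibull probability density function.
    return (k / lam) * (t / lam) ** (k - 1) * np.exp(-((t / lam) ** k))

rng = np.random.default_rng(7)
k, lam, sigma = 1.5, 2.0, 0.2               # illustrative parameters
dt, n_steps, n_paths = 0.01, 1000, 5
t = np.arange(1, n_steps + 1) * dt

x = np.full(n_paths, 1.0)                   # X_0 = 1
paths = np.empty((n_steps, n_paths))
for i, ti in enumerate(t):
    drift = x * weibull_pdf(ti, k, lam)
    x = x + drift * dt + sigma * x * np.sqrt(dt) * rng.standard_normal(n_paths)
    paths[i] = x

print("terminal values of the simulated paths:", np.round(paths[-1], 3))
```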

Keywords: Diffusion process, discrete sampling, likelihood estimation method, simulation, stochastic diffusion equation, trends functions, bi-parameters Weibull density function.
