Search results for: optimization process
5359 Integrated Wastewater Reuse Project of the Faculty of Sciences Ain Chock, Morocco
Authors: Nihad Chakri, Btissam El Amrani, Faouzi Berrada, Fouad Amraoui
Abstract:
In Morocco, water scarcity requires the exploitation of non-conventional resources. Rural areas are under-equipped with sanitation infrastructure, unlike urban areas. Decentralized and low-cost solutions could improve the quality of life of the population and the environment. In this context, the Faculty of Sciences Ain Chock (FSAC) has undertaken an integrated project to treat part of its wastewater using a decentralized compact system. The project proposes alternative solutions that are inexpensive and adapted to the context of peri-urban and rural areas, in order to treat the wastewater generated and to reuse it for irrigation, watering and cleaning. For this purpose, several tests were carried out in the laboratory in order to develop a liquid waste treatment system optimized for local conditions. Based on the laboratory-scale results of the different proposed scenarios, we designed and implemented a prototype of a mini wastewater treatment plant for the faculty. In this article, we outline the dimensioning, construction and monitoring steps of the mini-station at our faculty.
Keywords: Wastewater, purification, response surface methodology optimization, vertical filter, Moving Bed Biofilm Reactors, MBBR process, sizing, prototype, Faculty of Sciences Ain Chock, decentralized approach, mini wastewater treatment plant, reuse of treated wastewater, irrigation, sustainable development.
5358 The Effect of Solution Density on the Synthesis of Magnesium Borate from Boron-Gypsum
Authors: N. Tugrul, E. Sariburun, F. T. Senberber, A. S. Kipcak, E. Moroydor Derun, S. Piskin
Abstract:
Boron-gypsum is a waste that occurs in the boric acid production process. In this study, the boron content of this waste is evaluated for use in the synthesis of magnesium borates, since such valorisation of the waste is more useful than storage or disposal. Magnesium borates, which are a sub-class of boron minerals, are useful additive materials for industry due to their remarkable thermal and mechanical properties. Magnesium borates were obtained hydrothermally at different temperatures. The novelty of this study is the investigation of the effect of solution density on the magnesium borate synthesis process, in order to increase the possibility of using boron-gypsum as a raw material. After the synthesis process, the products were subjected to XRD and FT-IR to identify and characterize their crystal structure, respectively.
Keywords: Boron-gypsum, hydrothermal synthesis, magnesium borate, solution density.
5357 E-Voting: A Trustworthiness In Democratic; A View from Technology, Political and Social Issue
Authors: Sera Syarmila Sameon, Rohaini Ramli
Abstract:
A trustworthy voting process in a democracy requires that each vote be recorded with accuracy and impartiality. Accuracy and impartiality can be achieved at a high rate with a biometric system, one sign of which is the fingerprint. Fingerprint recognition is still a challenging problem because of the distortions among different impressions of the same finger. Given the trustworthiness of biometric voting technologies, they may have a great effect on the number of voters participating and on the outcomes of the democratic process. Hence, in this study, the authors are interested in designing and analyzing an electronic voting system and the participation of its users. The system is based on fingerprint minutiae with the addition of a person ID number, in order to enhance the accuracy and speed of the voting process. The new design is analyzed by conducting a pilot election among a class of students selecting their representative.
Keywords: Biometric, FAR and FRR, democratic, voting.
5356 Establishment and Evaluation of Information System for Chemotherapy Care
Authors: Yi-Ting Liu, Pei-Ying Wen
Abstract:
In order to improve the overall safety of chemotherapy, a safety-protecting net was established for the whole process, from prescribing by physicians and transcribing by nurses to dispensing by pharmacists and administering by nurses. The information system was used to check and monitor the whole administration process, and the related sheets were computerized to simplify the paperwork.
Keywords: Chemotherapy, Bar Code Medication Administration (BCMA), Medication Safety.
5355 Analysis of Noodle Production Process at Yan Hu Food Manufacturing: Basis for Production Improvement
Authors: Rhadinia Tayag-Relanes, Felina C. Young
Abstract:
This study was conducted to analyze the noodle production process at Yan Hu Food Manufacturing as a basis for production improvement. The study utilized the Plan, Do, Check, Act (PDCA) approach and record review in gathering data for the calendar year 2019, specifically from August to October, focusing on the noodle products miki, canton, and misua. A causal-comparative research design was employed to establish cause-effect relationships among the variables, using descriptive statistics and correlation to analyze the data gathered. The findings indicate that miki, canton, and misua production have distinct cycle times and production outputs in each set of production processes, as well as varying levels of wastage. The company has not yet established a formal allowable rejection rate for wastage; instead, this paper used a 1% wastage limit. We recommend the following: the machines used for each process of each noodle product must be consistently maintained and monitored; all production operators should be assessed statistically based on their output and on machine performance; a root cause analysis must be conducted to identify solutions to production issues; and an improved recording system for the inputs and outputs of the production process of each noodle product should be established to eliminate poor data recording.
Keywords: Production, continuous improvement, process, operations, Plan, Do, Check, Act approach.
5354 Studies on Lucrative Process Layout for Medium Scale Industries
Authors: Balamurugan Baladhandapani, Ganesh Renganathan, V. R. Sanal Kumar
Abstract:
In this paper, a comprehensive review of various factory layouts has been carried out for designing a lucrative process layout for medium scale industries. Industry databases reveal that the end-product rejection rate is on the order of 10%, amounting to a large profit loss. In order to avoid these rejection rates and to increase the production of quality products, an intermediate non-destructive testing facility (INDTF) is recommended for increasing the overall profit. We observed through detailed case studies that by introducing INDTF in medium scale industries, expensive production processes can be avoided for defective products well before they reach their final shape. Additionally, the defective products identified during the intermediate stage can be effectively utilized for other applications or recycled; thereby the overall wastage of raw materials can be reduced and profit increased. We conclude that the prudent design of a factory layout through the critical path method, facilitated by INDTF, will warrant a profitable outcome.
Keywords: Intermediate Non-destructive testing, Medium scale industries, Process layout design.
5353 Asymmetric Tukey’s Control Chart Robust to Skew and Non-Skew Process Observation
Authors: S. Sukparungsee
Abstract:
In reality, process observations often depart from the assumption that they are normally distributed. The observations may follow skewed distributions, for which an asymmetric chart should be used rather than a symmetric one. Consequently, this research aims to study the robustness of the asymmetric Tukey’s control chart for skew and non-skew distributions, such as the Lognormal and Laplace distributions. Furthermore, the performance of the asymmetric and symmetric Tukey’s control charts in detecting a change in a parameter is compared by the Average of Average Run Length (AARL). The results show that the asymmetric chart performs better than the symmetric Tukey’s control chart for both skew and non-skew process observations.
Keywords: Asymmetric control limit, average of average run length, Tukey’s control chart and skew distributions.
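For readers unfamiliar with quartile-based charts, the following Python sketch is illustrative only and is not taken from the paper: it builds one possible asymmetric Tukey-type pair of control limits (each side widened by its own half-spread, Q3 - Q2 above and Q2 - Q1 below, with an assumed multiplier k) and estimates run lengths by simulation on lognormal data.

import numpy as np

rng = np.random.default_rng(0)

def asymmetric_tukey_limits(phase1, k=2.2):
    # k is an assumed tuning constant, not the paper's calibrated value
    q1, q2, q3 = np.percentile(phase1, [25, 50, 75])
    lcl = q1 - k * (q2 - q1)   # lower limit scaled by the lower half-spread
    ucl = q3 + k * (q3 - q2)   # upper limit scaled by the upper half-spread
    return lcl, ucl

def simulated_arl(lcl, ucl, sampler, runs=500, max_len=5000):
    lengths = []
    for _ in range(runs):
        for t in range(1, max_len + 1):
            x = sampler()
            if x < lcl or x > ucl:   # out-of-control signal
                lengths.append(t)
                break
        else:
            lengths.append(max_len)
    return np.mean(lengths)

phase1 = rng.lognormal(mean=0.0, sigma=0.5, size=500)       # skewed in-control data
lcl, ucl = asymmetric_tukey_limits(phase1)
arl0 = simulated_arl(lcl, ucl, lambda: rng.lognormal(0.0, 0.5))
arl1 = simulated_arl(lcl, ucl, lambda: rng.lognormal(0.4, 0.5))   # shifted process
print(f"limits = ({lcl:.3f}, {ucl:.3f}), in-control ARL ~ {arl0:.0f}, shifted ARL ~ {arl1:.0f}")

Averaging such simulated ARLs over several shift sizes gives an AARL-style comparison between chart designs.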
5352 Nugget Formation during Resistance Spot Welding using Finite Element Model
Authors: Jawad Saleem, Abdul Majid, Kent Bertilsson, Torbjörn Carlberg, Nazar Ul Islam
Abstract:
The resistance spot welding process comprises electrical, thermal and mechanical phenomena, which makes the process complex and highly non-linear and thus difficult to model. In order to obtain a good weld nugget during spot welding, trial welds are usually done, which is very costly. Therefore, numerical simulation research has been conducted to understand the whole process. In this paper, three different cases were analyzed by varying the tip contact area, and it was observed that, with the variation of the tip contact area, the nugget formation at the faying surface is affected. The tip contact area of the welding electrode becomes large with long welding cycles. Therefore, in order to maintain consistency of nugget formation during the welding process, current compensation in the control feedback is required. If the contact area of the welding electrode tip is reduced, a large amount of current flows through the faying surface, as a result of which sputtering occurs.
Keywords: Resistance spot welding, finite element modeling, nugget formation, welding electrode, numerical simulation.
5351 Improvement of GVPI Insulation System Characteristics by Curing Process Modification
Authors: M. Shadmand
Abstract:
The curing process of the insulation system for electrical machines plays a determinative role in its durability and reliability. The polar structure of the insulating resin molecules and the filler used in the insulation system can be leveraged to enhance the overall mechanical and electrical characteristics of the insulation. The curing regime governs these characteristics by arranging the polymerization of the resin chain structure. In this research, the effect of applying an electrical field to the insulation system while it cures was considered for a Global Vacuum Pressurized Impregnation (GVPI) system for a traction motor, by performing dissipation factor, polarization and depolarization current (PDC) and voltage endurance (aging) measurements on sample test objects. The results show an obvious improvement in the mechanical strength of the insulation system as well as better electrical characteristics in routine and long-term (aging) electrical tests. Taken together, polarizing the insulation system during the curing process would enhance the machine lifetime.
Keywords: Insulation system, GVPI, PDC, aging.
5350 The Forensic Swing of Things: The Current Legal and Technical Challenges of IoT Forensics
Authors: Pantaleon Lutta, Mohamed Sedky, Mohamed Hassan
Abstract:
The inability of organizations to put in place management control measures for Internet of Things (IoT) complexities continues to be a risk concern. Policy makers have been left scrambling to find measures to combat these security and privacy concerns. IoT forensics is a cumbersome process, as there is no standardization of IoT products and little or no historical data are stored on the devices. This paper highlights why IoT forensics is a unique adventure and brings out the legal challenges encountered in the investigation process. A quadrant model is presented to study the conflicting aspects in IoT forensics. The model analyses the effectiveness of the forensic investigation process versus the admissibility and integrity of the evidence, taking into account user privacy and the providers’ compliance with laws and regulations. Our analysis concludes that a semi-automated forensic process using machine learning could eliminate the human factor from the profiling and surveillance processes, and hence resolve the issues of data protection (privacy and confidentiality).
Keywords: Cloud forensics, data protection laws, GDPR, IoT forensics, machine learning.
5349 The Study of the Discrete Risk Model with Random Income
Authors: Peichen Zhao
Abstract:
In this paper, we extend the compound binomial model to the case where the premium income process, based on a binomial process, is no longer a linear function. First, a recursive formula is derived for the non-ruin probability. Second, we show that the expected discounted penalty function satisfies a defective renewal equation. Third, an asymptotic estimate for the expected discounted penalty function is given. Finally, we give two examples of ruin quantities to illustrate applications of the recursive formula and of the asymptotic estimate for the penalty function.
Keywords: Discounted penalty function, compound binomial process, recursive formula, discrete renewal equation, asymptotic estimate.
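As an illustration of the model class only (not the paper's recursive formula), the Python sketch below estimates the ruin probability by Monte Carlo in a discrete-time compound binomial model where the premium income in each period is itself Bernoulli-distributed rather than a fixed unit; all parameter values are assumed for the example.

import random

def ruin_probability(u0, horizon=200, n_paths=20_000,
                     p_income=0.9,       # probability of receiving one unit of premium
                     p_claim=0.3,        # probability a claim occurs in a period
                     claim_sizes=(1, 2, 3), claim_probs=(0.6, 0.3, 0.1),
                     seed=1):
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        u = u0
        for _ in range(horizon):
            u += 1 if rng.random() < p_income else 0           # random premium income
            if rng.random() < p_claim:                         # claim arrival
                u -= rng.choices(claim_sizes, claim_probs)[0]  # claim severity
            if u < 0:                                          # ruin: surplus below zero
                ruined += 1
                break
    return ruined / n_paths

for u0 in (0, 2, 5, 10):
    print(u0, ruin_probability(u0))

A recursive formula of the kind derived in the paper would give these probabilities exactly, with the simulation serving only as a numerical cross-check.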
5348 XML Data Management in Compressed Relational Database
Authors: Hongzhi Wang, Jianzhong Li, Hong Gao
Abstract:
XML is an important standard for data exchange and representation. Since relational databases are mature systems, using them to support XML data may bring some advantages. However, storing XML in a relational database has obvious redundancy that wastes disk space, bandwidth and disk I/O when querying XML data. For efficient storage and querying of XML, it is necessary to use compressed XML data in the relational database. In this paper, a compressed relational database technology supporting XML data is presented. The original relational storage structure is adapted to the XPath query process, and the compression method keeps this feature. Besides traditional relational database techniques, additional query processing technologies for compressed relations and for XML-specific structures are presented. Technologies for XQuery processing in a compressed relational database are also presented.
Keywords: XML, compression, query processing.
5347 Solving Weighted Number of Operation Plus Processing Time Due-Date Assignment, Weighted Scheduling and Process Planning Integration Problem Using Genetic and Simulated Annealing Search Methods
Authors: Halil Ibrahim Demir, Caner Erden, Mumtaz Ipek, Ozer Uygun
Abstract:
Traditionally, the three important manufacturing functions of process planning, scheduling and due-date assignment are performed separately and sequentially. Over the past couple of decades, hundreds of studies have addressed integrated process planning and scheduling, and numerous studies have addressed scheduling with due-date assignment, but unfortunately the integration of these three important functions has not been adequately addressed. Here, the integration of these three functions is studied using genetic, random-genetic hybrid, simulated annealing, random-simulated annealing hybrid and random search techniques. The importance of integrating these three functions, and the power of meta-heuristics and hybrid heuristics, are also studied.
Keywords: Process planning, weighted scheduling, weighted due-date assignment, genetic search, simulated annealing, hybrid meta-heuristics.
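To make the integration concrete, the following Python sketch (assumed toy data and cost weights, not the paper's benchmark or encoding) runs a compact simulated-annealing loop over all three decisions at once: the process plan chosen for each job, the job sequence, and a TWK-style due-date multiplier k (due date = k times processing time). The cost penalises weighted tardiness, weighted earliness and long quoted due dates.

import math, random

rng = random.Random(42)
# each job: alternative process plans (processing times) and a customer weight
jobs = [{"plans": [8, 10, 12], "w": 3},
        {"plans": [5, 7],      "w": 1},
        {"plans": [9, 9, 14],  "w": 2},
        {"plans": [4, 6],      "w": 2},
        {"plans": [11, 13],    "w": 1}]

def cost(seq, plan_idx, k):
    t, total = 0, 0.0
    for j in seq:
        p = jobs[j]["plans"][plan_idx[j]]
        due = k * p                      # TWK due-date assignment
        t += p                           # completion time on a single machine
        total += jobs[j]["w"] * (5 * max(0, t - due) + max(0, due - t)) + 0.5 * due
    return total

def neighbour(seq, plan_idx, k):
    seq, plan_idx = seq[:], plan_idx[:]
    move = rng.randrange(3)
    if move == 0:                        # swap two positions in the sequence
        a, b = rng.sample(range(len(seq)), 2)
        seq[a], seq[b] = seq[b], seq[a]
    elif move == 1:                      # switch one job to another process plan
        j = rng.randrange(len(jobs))
        plan_idx[j] = rng.randrange(len(jobs[j]["plans"]))
    else:                                # nudge the due-date multiplier
        k = max(1.0, k + rng.uniform(-0.2, 0.2))
    return seq, plan_idx, k

seq, plan_idx, k = list(range(len(jobs))), [0] * len(jobs), 2.0
best = cur = cost(seq, plan_idx, k)
T = 50.0
while T > 0.01:
    cand = neighbour(seq, plan_idx, k)
    c = cost(*cand)
    if c < cur or rng.random() < math.exp((cur - c) / T):
        seq, plan_idx, k = cand
        cur = c
        best = min(best, cur)
    T *= 0.995
print("best cost found:", round(best, 2))

Swapping the acceptance rule or the neighbourhood moves is how the genetic, hybrid and pure random-search variants compared in the paper differ from this skeleton.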
5346 Normalization and Constrained Optimization of Measures of Fuzzy Entropy
Authors: K.C. Deshmukh, P.G. Khot, Nikhil
Abstract:
In the information theory literature, there is a need to compare different measures of fuzzy entropy, which consequently gives rise to the need to normalize them. In this paper, we discuss this need and develop some normalized measures of fuzzy entropy. It is also desirable to maximize entropy and to minimize directed divergence or distance. Keeping this idea in mind, we explain the method of optimizing different measures of fuzzy entropy.
Keywords: Fuzzy set, uncertainty, fuzzy entropy, normalization, membership function.
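As a generic illustration of the normalization idea (using the classical De Luca-Termini measure rather than the specific measures developed in the paper), the Python sketch below divides the entropy by its maximum value n*ln(2), attained when every membership value equals 0.5, so that different fuzzy sets can be compared on a [0, 1] scale.

import math

def deluca_termini_entropy(memberships):
    h = 0.0
    for mu in memberships:
        if 0.0 < mu < 1.0:   # terms with mu in {0, 1} contribute zero
            h -= mu * math.log(mu) + (1 - mu) * math.log(1 - mu)
    return h

def normalized_fuzzy_entropy(memberships):
    n = len(memberships)
    return deluca_termini_entropy(memberships) / (n * math.log(2))

print(normalized_fuzzy_entropy([0.5, 0.5, 0.5]))          # 1.0 (maximum fuzziness)
print(normalized_fuzzy_entropy([0.0, 1.0, 1.0]))          # 0.0 (crisp set)
print(round(normalized_fuzzy_entropy([0.2, 0.7, 0.9]), 3))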
5345 Environmental Decision Making Model for Assessing On-Site Performances of Building Subcontractors
Authors: Buket Metin
Abstract:
Buildings cause a variety of loads on the environment due to activities performed at each stage of the building life cycle. Construction is the first stage that affects both the natural and built environments at different steps of the process, which can be defined as transportation of materials within the construction site, formation and preparation of materials on-site and the application of materials to realize the building subsystems. All of these steps require the use of technology, which varies based on the facilities that contractors and subcontractors have. Hence, environmental consequences of the construction process should be tackled by focusing on construction technology options used in every step of the process. This paper presents an environmental decision-making model for assessing on-site performances of subcontractors based on the construction technology options which they can supply. First, construction technologies, which constitute information, tools and methods, are classified. Then, environmental performance criteria are set forth related to resource consumption, ecosystem quality, and human health issues. Finally, the model is developed based on the relationships between the construction technology components and the environmental performance criteria. The Fuzzy Analytical Hierarchy Process (FAHP) method is used for weighting the environmental performance criteria according to environmental priorities of decision-maker(s), while the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) method is used for ranking on-site environmental performances of subcontractors using quantitative data related to the construction technology components. Thus, the model aims to provide an insight to decision-maker(s) about the environmental consequences of the construction process and to provide an opportunity to improve the overall environmental performance of construction sites.
Keywords: Construction process, construction technology, decision making, environmental performance, subcontractors.
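The ranking step of such a model can be illustrated with a short Python sketch of standard TOPSIS; the subcontractor scores, criteria and weights below are assumptions for the example (in the model described above, the weights would come from the fuzzy AHP step).

import numpy as np

# rows = subcontractors, columns = environmental criteria
# criteria: resource consumption, ecosystem quality impact, human health impact (all "lower is better")
scores = np.array([[0.42, 0.30, 0.25],
                   [0.35, 0.45, 0.20],
                   [0.50, 0.25, 0.40]])
weights = np.array([0.5, 0.3, 0.2])          # e.g. obtained from FAHP
benefit = np.array([False, False, False])    # all cost-type criteria here

norm = scores / np.linalg.norm(scores, axis=0)   # vector normalisation
v = norm * weights                               # weighted normalised matrix

ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))       # positive ideal solution
anti_ideal = np.where(benefit, v.min(axis=0), v.max(axis=0))  # negative ideal solution

d_pos = np.linalg.norm(v - ideal, axis=1)
d_neg = np.linalg.norm(v - anti_ideal, axis=1)
closeness = d_neg / (d_pos + d_neg)              # closer to 1 = better on-site performance

for rank, i in enumerate(np.argsort(-closeness), 1):
    print(f"rank {rank}: subcontractor {i + 1}, closeness = {closeness[i]:.3f}")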
5344 QR Technology to Automate Health Condition Detection Payment System: A Case Study in Schools of the Kingdom of Saudi Arabia
Authors: Amjad Alsulami, Farah Albishri, Kholod Alzubidi, Lama Almehemadi, Salma Elhag
Abstract:
Food allergy is a common and rising problem among children. Many students have their first allergic reaction at school, and one such reaction is anaphylaxis, which can be fatal. This study found that several schools' processes lacked safety regulations and information on how to handle allergy issues and chronic diseases such as diabetes, and that students were not supervised or monitored during the cafeteria purchasing process. Academic institutions make no obvious prevention effort when students purchase food containing allergens or food that negatively impacts the health of students who suffer from chronic diseases. The stability of students' health must be maintained because it greatly affects their performance and educational achievement. To address this issue, this paper uses business process reengineering to propose the automation of the whole food-purchasing process, which will aid in detecting and avoiding allergic occurrences and in preventing side effects from eating foods that conflict with students' health. This may be achieved by designing a smart card with an embedded QR code that reveals which foods cause an allergic reaction in a student. A survey was distributed to determine and examine how the cafeteria handles allergic children and whether any management or policy is applied in the school. The survey findings indicate that the integration of QR technology into the food purchasing process would improve health condition detection. Families supported the suggested solution as advantageous because it ensures their children avoid eating food that is not allowed. Moreover, by analyzing and simulating the as-is process and the suggested process, the results demonstrate an improvement in quality and time.
Keywords: QR code, smart card, food allergies, Business Process reengineering, health condition detection.
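A minimal Python sketch of the card-and-checkout idea (the data model, field names and allergen list are hypothetical, not the schools' production system): the student's allergen profile is encoded as JSON in a QR code on the smart card, and each cafeteria purchase is checked against it. The widely available "qrcode" package is used to render the code.

import json
import qrcode

student = {"id": "S-1024", "allergens": ["peanut", "egg"], "conditions": ["diabetes"]}
payload = json.dumps(student)

qrcode.make(payload).save("student_S-1024.png")   # image printed onto the smart card

def check_purchase(scanned_payload, item_ingredients):
    profile = json.loads(scanned_payload)
    blocked = set(profile["allergens"]) & set(item_ingredients)
    return ("BLOCK: contains " + ", ".join(sorted(blocked))) if blocked else "ALLOW"

print(check_purchase(payload, ["wheat", "egg", "milk"]))   # BLOCK: contains egg
print(check_purchase(payload, ["rice", "chicken"]))        # ALLOW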
5343 CO2 Emission and Cost Optimization of Reinforced Concrete Frame Designed by Performance Based Design Approach
Authors: Jin Woo Hwang, Byung Kwan Oh, Yousok Kim, Hyo Seon Park
Abstract:
As the greenhouse effect has been recognized as a serious environmental problem, interest in carbon dioxide (CO2) emissions, which comprise a major part of greenhouse gas (GHG) emissions, has increased recently. Since the construction industry accounts for a relatively large portion of the world's total CO2 emissions, extensive studies on reducing CO2 emissions in the construction and operation of buildings have been carried out since the 2000s. Also, the performance-based design (PBD) methodology, based on nonlinear analysis, has been developed vigorously since the Northridge Earthquake in 1994, because structural engineers recognized that the prescriptive code-based design approach cannot address inelastic earthquake responses directly or assure building performance exactly. Although CO2 emissions and the PBD approach are both rising issues in the construction industry and structural engineering, there has been little or no research considering these two issues simultaneously. Thus, the objective of this study is to minimize the CO2 emissions and cost of a building designed by the PBD approach at the structural design stage, considering the structural materials. A 4-story, 4-span reinforced concrete building was optimally designed to minimize CO2 emissions and cost while satisfying a specific seismic performance objective (collapse prevention under the maximum considered earthquake) and prescriptive code regulations, using the non-dominated sorting genetic algorithm-II (NSGA-II). The optimized design results showed that minimized CO2 emissions and cost were achieved while the specified seismic performance was satisfied. Therefore, the methodology proposed in this paper can be used to reduce both the CO2 emissions and the cost of buildings designed by the PBD approach.
Keywords: CO2 emissions, performance based design, optimization, sustainable design.
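The bi-objective core of such a search can be illustrated without reproducing NSGA-II itself: the Python sketch below takes candidate frame designs with assumed embodied-CO2 and cost values (purely illustrative numbers, not results from the study) and extracts the non-dominated (Pareto) set from which a designer would pick a solution that also passes the collapse-prevention check.

# candidate designs: (label, CO2 emissions [t-CO2], relative cost)
candidates = [
    ("D1", 182.0, 1.00),
    ("D2", 175.0, 1.08),
    ("D3", 190.0, 0.97),
    ("D4", 185.0, 1.05),
    ("D5", 201.0, 0.95),
]

def dominates(a, b):
    # a dominates b if it is no worse in both objectives and strictly better in at least one
    return a[1] <= b[1] and a[2] <= b[2] and (a[1] < b[1] or a[2] < b[2])

pareto = [c for c in candidates
          if not any(dominates(other, c) for other in candidates if other is not c)]

for label, co2, cost in sorted(pareto, key=lambda c: c[1]):
    print(f"{label}: CO2 = {co2} t, cost = {cost}")   # D4 is dominated by D1 and drops out

NSGA-II repeatedly applies this kind of non-dominated sorting, plus crowding-distance selection, to evolve the whole front rather than a single trade-off point.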
5342 Treatment of Oily Wastewater by Fibrous Coalescer Process: Stage Coalescer and Model Prediction
Authors: Pisut Painmanakul, Kotchakorn Kongkangwarn, Nattawin Chawaloesphonsiya
Abstract:
The coalescer process is one of the methods for oily water treatment; it increases the oil droplet size in order to enhance the separation velocity and thus achieve effective separation. However, the presence of surfactants in an oily emulsion can limit these mechanisms because of the small oil droplet size associated with a stabilized emulsion. In this regard, the purpose of this research is to improve the efficiency of the coalescer process for treating stabilized emulsions. The effects of bed type, bed height, liquid flow rate and stage coalescer (step-bed) configuration on the treatment efficiencies, in terms of COD values, were studied. Note that the treatment efficiency obtained experimentally was estimated using the COD values and the oil droplet size distribution. The study showed that plastic media attach to oil particles more effectively than stainless steel media due to their hydrophobic properties. Furthermore, a suitable bed height (3.5 cm) and step bed (3.5 cm with 2 steps) were necessary in order to obtain good coalescer performance. The application of the step-bed coalescer process in the reactor provided higher treatment efficiencies in terms of COD removal than those obtained with the classical process. The proposed model for predicting the area under the curve, and thus the treatment efficiency, based on the single collector efficiency (ηT) and the attachment efficiency (α), shows relatively good agreement between the experimental and predicted treatment efficiencies in this study.
Keywords: Stage coalescer, stabilized emulsions, treatment efficiency, model prediction.
5341 RDFGraph: New Data Modeling Tool for Semantic Web
Authors: Daniel Siahaan, Aditya Prapanca
Abstract:
The emerging Semantic Web has attracted many researchers and developers. New applications have been developed on top of the Semantic Web, and many supporting tools have been introduced to improve its software development process. Metadata modeling is one part of the development process where supporting tools exist. The existing tools lack readability and ease of use for a domain knowledge expert to graphically model a problem as a semantic model. In this paper, a metadata modeling tool called RDFGraph is proposed to solve these problems. RDFGraph is also designed to work with modern database management systems that support RDF and to improve the performance of the query execution process. The testing results show that the rules used in RDFGraph follow the W3C standard and that the graphical model produced by this tool is properly translated and correct.
Keywords: CASE tool, data modeling, semantic web.
5340 A Refined Energy-Based Model for Friction-Stir Welding
Authors: Samir A. Emam, Ali El Domiaty
Abstract:
Friction-stir welding has received huge interest in the last few years. The many advantages of this promising process have led researchers to present different theoretical and experimental explanations of the process. The way to quantitatively and qualitatively control the different parameters of the friction-stir welding process has not yet been paved. In this study, a refined energy-based model that estimates the energy generated due to friction and plastic deformation is presented. The effect of plastic deformation at low energy levels is significant, and hence a scale factor is introduced to control its effect. The predicted heat energy and the obtained maximum temperature using our model are compared to theoretical and experimental results available in the literature, and good agreement is obtained. The model is applied to AA6000 and AA7000 series alloys.
Keywords: Friction-stir welding, Energy, Aluminum Alloys.
5339 The Adoption of Process Management for Accounting Information Systems in Thailand
Authors: Manirath Wongsim, Pawornprat Hongsakon
Abstract:
Information Quality (IQ) has become a critical, strategic issue in Accounting Information Systems (AIS) adoption. In order to implement AIS adoption successfully, it is important to consider the quality of information used throughout the adoption process, which seriously impacts the effectiveness of AIS adoption practice and the optimisation of AIS adoption decisions. There is a growing need for research to provide insights into issues and solutions related to IQ in AIS adoption. The need for an integrated approach to improve IQ in AIS adoption, as well as the unique characteristics of accounting data, demands an AIS-adoption-specific IQ framework. This research aims to explore ways of managing information quality and AIS adoption in order to investigate the relationship between IQ issues and the AIS adoption process. The study has led to the development of a framework for understanding IQ management in AIS adoption. The research was conducted with 44 respondents from ten manufacturing firms in Thailand. The empirical evidence suggests that IQ dimensions in AIS adoption provide assistance in all processes of decision making. This research provides empirical evidence that the information quality of AIS adoption affects decision making, and suggests that these variables should be considered when adopting AIS in order to improve its effectiveness.
Keywords: Information quality, information quality dimensions, accounting information systems, accounting information system adoption.
5338 The Coupling of Photocatalytic Oxidation Processes with Activated Carbon Technologies and the Comparison of the Treatment Methods for Organic Removal from Surface Water
Authors: N. Areerachakul
Abstract:
The surface water used in this study was collected from the lower part of the Chao Praya River at the Nonthaburi bridge and was used throughout the experiment. TOC (also known as DOC) in the range of 2.5 to 5.6 mg/L was investigated in this experiment. The conventional treatment methods showed TOC removals of 65% using FeCl3 and 78% using PAC (powdered activated carbon). The advanced oxidation process alone showed only 35% removal of TOC. Coupling advanced oxidation with a small amount of PAC (0.05 g/L) increased the efficiency by up to 55%. The combination of BAC with the advanced oxidation process and a small amount of PAC demonstrated the highest efficiency, up to 95% TOC removal, with lower sludge production compared with the other methods.
Keywords: Advanced oxidation process, TOC, PAC
5337 Customer Knowledge and Service Development, the Web 2.0 Role in Co-production
Authors: Roberto Boselli, Mirko Cesarini, Mario Mezzanzanica
Abstract:
The paper is concerned with relationships between SSME and ICTs and focuses on the role of Web 2.0 tools in the service development process. The research presented aims at exploring how collaborative technologies can support and improve service processes, highlighting customer centrality and value co-production. The core idea of the paper is the centrality of user participation, with collaborative technologies as enabling factors; Wikipedia is analyzed as an example. The result of this analysis is the identification and description of a pattern characterising specific services in which users collaborate, by means of web tools, with value co-producers during the service process. The pattern of collaborative co-production concerning several categories of services, including knowledge-based services, is then discussed.
Keywords: Service interaction patterns, services science, Web 2.0 tools, service development process.
5336 Stability Bound of Ruin Probability in a Reduced Two-Dimensional Risk Model
Authors: Zina Benouaret, Djamil Aissani
Abstract:
In this work, we introduce the qualitative and quantitative concept of the strong stability method in a risk process modeling two lines of business of the same insurance company, or an insurance and a re-insurance company that divide both claims and premiums between them in a certain proportion. The proposed approach is based on identifying the ruin probability associated with the considered model with the stationary distribution of a Markov random process, called the reversed process. Our objective, after clarifying the conditions and the perturbation domain of the parameters, is to obtain a stability inequality for the ruin probability, which is applied to estimate the approximation error between a model with perturbed parameters and the considered model. In the stability bound obtained, all constants are written explicitly.
Keywords: Markov chain, risk models, ruin probabilities, strong stability analysis.
5335 A Stochastic Diffusion Process Based on the Two-Parameters Weibull Density Function
Authors: Meriem Bahij, Ahmed Nafidi, Boujemâa Achchab, Sílvio M. A. Gama, José A. O. Matos
Abstract:
Stochastic modeling concerns the use of probability to model real-world situations in which uncertainty is present. The purpose of stochastic modeling is therefore to estimate the probability of outcomes within a forecast, i.e. to be able to predict what conditions or decisions might occur under different situations. In the present study, we present a model of a stochastic diffusion process based on the two-parameter (bi-)Weibull distribution function (its trend is proportional to the bi-Weibull probability density function). In general, the Weibull distribution can assume the characteristics of many different types of distributions. This has made it very popular among engineers and quality practitioners, who consider it the most commonly used distribution for studying problems such as modeling reliability data, accelerated life testing, and maintainability modeling and analysis. In this work, we start by obtaining the probabilistic characteristics of this model: the explicit expression of the process, its trend functions, and its distribution, by transforming the diffusion process into a Wiener process as shown in Ricciardi's theorem. Then, we develop the statistical inference of this model using the maximum likelihood methodology. Finally, we analyse, with simulated data, the computational problems associated with the parameters, an issue of great importance in its application to real data, using convergence analysis methods. Overall, the use of a stochastic model reflects only a pragmatic decision on the part of the modeler. Given the available data and the universe of models known to the modeler, this model represents the best currently available description of the phenomenon under consideration.
Keywords: Diffusion process, discrete sampling, likelihood estimation method, simulation, stochastic diffusion equation, trend functions, bi-parameter Weibull density function.
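For readers who want to see the two-parameter Weibull building block in isolation (this sketch does not attempt the full diffusion-process inference), the Python example below uses the density f(x) = (c/s)(x/s)^(c-1) exp(-(x/s)^c), simulates a sample, and recovers the shape c and scale s by maximum likelihood with SciPy; the parameter values are assumed.

import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
c_true, s_true = 1.8, 2.5                        # shape and scale
sample = s_true * rng.weibull(c_true, size=5000)

# fix the location at 0 so only the two Weibull parameters are estimated
c_hat, loc, s_hat = stats.weibull_min.fit(sample, floc=0)
print(f"true (c, s) = ({c_true}, {s_true}); ML estimates = ({c_hat:.3f}, {s_hat:.3f})")

# density evaluated at a few points, for checking against a histogram of the sample
x = np.array([0.5, 1.0, 2.0, 4.0])
print(stats.weibull_min.pdf(x, c_hat, scale=s_hat))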
5334 Application of Single Tuned Passive Filters in Distribution Networks at the Point of Common Coupling
Authors: M. Almutairi, S. Hadjiloucas
Abstract:
The harmonic distortion of voltage is important in relation to power quality due to the interaction between widely diffused non-linear and time-varying single-phase and three-phase loads and power supply systems. However, harmonic distortion levels can be reduced by improving the design of polluting loads or by applying mitigation arrangements and adding filters. The application of passive filters is an effective solution for harmonic mitigation, mainly because filters offer high efficiency and simplicity and are economical. Additionally, their different possible frequency response characteristics can be used to achieve specific harmonic filtering targets. With these ideas in mind, the objective of this paper is to determine the size of single tuned passive filter that works best in distribution networks in order to economically limit violations at a given point of common coupling (PCC). This article suggests that a single tuned passive filter could be employed in typical industrial power systems. Furthermore, constrained optimization can be used to find the optimal sizing of the passive filter in order to reduce both harmonic voltages and harmonic currents in the power system to an acceptable level and, thus, improve the load power factor. The optimization technique minimizes voltage total harmonic distortion (VTHD) and current total harmonic distortion (ITHD) while maintaining a given power factor within a specified range. According to IEEE Standard 519, both indices are treated as constraints in the optimal passive filter design problem. The performance of this technique is discussed using numerical examples taken from previous publications.
Keywords: Harmonics, passive filter, power factor, power quality.
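As background to the sizing problem, the Python sketch below walks through the textbook single tuned filter calculation with assumed numbers (not the paper's case study): the capacitor is sized from the reactive power needed to raise the displacement power factor, the reactor is chosen so the branch is tuned at the target harmonic order h, and the damping resistor follows from an assumed quality factor.

import math

V = 400.0            # line-to-line voltage [V]
f = 50.0             # fundamental frequency [Hz]
P = 150e3            # active load power [W]
pf_initial = 0.80
pf_target = 0.95
h = 5                # tuned harmonic order (5th)
Q_factor = 40        # assumed filter quality factor

# reactive power the filter must supply at the fundamental
Qc = P * (math.tan(math.acos(pf_initial)) - math.tan(math.acos(pf_target)))

X_eff = V**2 / Qc                     # net (capacitive) reactance seen at 50 Hz
X_C = X_eff * h**2 / (h**2 - 1)       # capacitor reactance at the fundamental
X_L = X_C / h**2                      # reactor reactance at the fundamental
C = 1 / (2 * math.pi * f * X_C)
L = X_L / (2 * math.pi * f)
X_n = math.sqrt(X_L * X_C)            # reactance at the tuned frequency
R = X_n / Q_factor                    # series damping resistance

print(f"Qc = {Qc/1e3:.1f} kvar, C = {C*1e6:.1f} uF, L = {L*1e3:.2f} mH, R = {R:.3f} ohm")

A constrained optimization of the kind described above would search over such candidate sizings while checking the resulting VTHD, ITHD and power factor against the IEEE 519 limits.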
5333 Learning Materials of Atmospheric Pressure Plasma Process: Turning Hydrophilic Surface to Hydrophobic
Authors: C.W. Kan
Abstract:
This paper investigates the use of atmospheric pressure plasma for improving the surface hydrophobicity of polyurethane synthetic leather with tetramethylsilane (TMS). The atmospheric pressure plasma treatment with TMS is a single-step process to enhance the hydrophobicity of polyurethane synthetic leather. The hydrophobicity of the treated surface was examined by contact angle measurement. The physical and chemical surface changes were evaluated by scanning electron microscopy (SEM) and infrared spectroscopy (FTIR). The purpose of this paper is to provide learning materials for understanding how to use atmospheric pressure plasma in the textile finishing process to transform a hydrophilic surface to hydrophobic.
Keywords: Learning materials, atmospheric pressure plasma treatment, hydrophobic, hydrophilic, surface.
5332 Optimal Design of Selective Excitation Pulses in Magnetic Resonance Imaging using Genetic Algorithms
Authors: Mohammed A. Alolfe, Abou-Bakr M. Youssef, Yasser M. Kadah
Abstract:
The proper design of RF pulses in magnetic resonance imaging (MRI) has a direct impact on the quality of acquired images and is needed for many applications. Several techniques have been proposed to obtain the RF pulse envelope given the desired slice profile. Unfortunately, these techniques do not take into account the limitations of practical implementation, such as limited amplitude resolution. Moreover, implementing constraints for special RF pulses is not possible in most techniques. In this work, we develop an approach for designing optimal RF pulses under, in principle, arbitrary constraints. The new technique poses the RF pulse design problem as a combinatorial optimization problem and uses efficient techniques from this area, such as genetic algorithms (GA), to solve it. In particular, an objective function is proposed as the norm of the difference between the desired profile and the one obtained by solving the Bloch equations for the current RF pulse design values. The proposed approach is verified using analytical-solution-based RF simulations and compared to previous methods such as the Shinnar-Le Roux (SLR) method; the options and parameters that control the genetic algorithm, which can significantly affect its performance, are analysed, selected and tested to obtain the best results. The results show a significant improvement over conventional design techniques, identify the best GA options and parameters, and suggest the practicality of using the new technique for important applications such as slice selection for large flip angles, unconventional spatial encoding, and other clinical uses.
Keywords: Selective excitation, magnetic resonance imaging, combinatorial optimization, pulse design.
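To show the shape of this optimization formulation, the Python sketch below evolves a discretised RF envelope with a minimal genetic algorithm so that its excitation profile matches a desired rectangular slice profile; it uses the small-tip-angle Fourier approximation instead of a full Bloch solver, and the population size, pulse length and GA settings are assumptions, not the paper's tested parameters.

import numpy as np

rng = np.random.default_rng(3)
n = 64                                   # RF samples per pulse
profile_pts = 256
desired = np.zeros(profile_pts)
desired[profile_pts // 2 - 16: profile_pts // 2 + 16] = 1.0   # target slice profile

def excitation_profile(pulse):
    # small-tip-angle approximation: profile ~ |Fourier transform of the RF envelope|
    spectrum = np.fft.fftshift(np.abs(np.fft.fft(pulse, profile_pts)))
    return spectrum / (spectrum.max() + 1e-12)

def fitness(pulse):
    return -np.linalg.norm(excitation_profile(pulse) - desired)   # higher is better

pop = rng.normal(0, 0.1, size=(40, n))
for gen in range(300):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[::-1][:20]]                  # truncation selection
    children = []
    for _ in range(20):
        a, b = parents[rng.integers(20)], parents[rng.integers(20)]
        cut = rng.integers(1, n)                                  # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        child += rng.normal(0, 0.02, n) * (rng.random(n) < 0.1)   # sparse mutation
        children.append(child)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(p) for p in pop])]
print("final profile error:", -fitness(best))

In the full approach described above, the profile would come from a Bloch-equation simulation, and amplitude-resolution or hardware constraints would be enforced directly on each candidate pulse.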
5331 Solving Process Planning, Weighted Earliest Due Date Scheduling and Weighted Due Date Assignment Using Simulated Annealing and Evolutionary Strategies
Authors: Halil Ibrahim Demir, Abdullah Hulusi Kokcam, Fuat Simsir, Özer Uygun
Abstract:
Traditionally, three important manufacturing functions which are process planning, scheduling and due-date assignment are performed sequentially and separately. Although there are numerous works on the integration of process planning and scheduling and plenty of works focusing on scheduling with due date assignment, there are only a few works on integrated process planning, scheduling and due-date assignment. Although due-dates are determined without taking into account of weights of the customers in the literature, here weighted due-date assignment is employed to get better performance. Jobs are scheduled according to weighted earliest due date dispatching rule and due dates are determined according to some popular due date assignment methods by taking into account of the weights of each job. Simulated Annealing, Evolutionary Strategies, Random Search, hybrid of Random Search and Simulated Annealing, and hybrid of Random Search and Evolutionary Strategies, are applied as solution techniques. Three important manufacturing functions are integrated step-by-step and higher integration levels are found better. Search meta-heuristics are found to be very useful while improving performance measure.
Keywords: Evolutionary strategies, hybrid searches, process planning, simulated annealing, weighted due-date assignment, weighted scheduling.
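The dispatching side of this combination can be illustrated with a short Python sketch (the job data, weights and due-date rule parameters are assumptions for the example): due dates are assigned with a weighted TWK-style rule in which heavier-weighted customers receive tighter quotes, and jobs are then dispatched by a weighted earliest-due-date rule on a single machine.

jobs = [  # (job id, processing time, customer weight)
    ("J1", 6, 3),
    ("J2", 4, 1),
    ("J3", 9, 2),
    ("J4", 3, 5),
]

def assign_due_date(p, w, k_base=3.0):
    # heavier customers get tighter (smaller) due-date multipliers
    return (k_base / w**0.5) * p

quoted = [(jid, p, w, assign_due_date(p, w)) for jid, p, w in jobs]

# weighted earliest due date: smaller due/weight ratio is dispatched first
schedule = sorted(quoted, key=lambda j: j[3] / j[2])

t, total_wt = 0, 0.0
for jid, p, w, due in schedule:
    t += p
    tardiness = max(0.0, t - due)
    total_wt += w * tardiness
    print(f"{jid}: due = {due:.1f}, completes at {t}, weighted tardiness = {w * tardiness:.1f}")
print("total weighted tardiness:", round(total_wt, 1))

In the integrated problem described above, the meta-heuristics additionally choose each job's process plan and the due-date rule parameters, with this kind of dispatch-and-evaluate step inside the objective function.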
5330 Chatter Suppression in Boring Process Using Passive Damper
Authors: V. Prasannavenkadesan, A. Elango, S. Chockalingam
Abstract:
During the machining process, chatter is an unavoidable phenomenon. Boring bars have a cantilever shape and, because of this, are subjected to chatter. The adverse effects of chatter include an increase in temperature, which leads to excess tool wear. To overcome these problems, in this investigation, cartridge brass (Cu 70% and Zn 30%) is passively fixed on the boring bar, and a clearance is provided in order to reduce the displacement, tool wear and cutting temperature. A conventional all-geared lathe fitted with a vibrometer and a pyrometer is used to measure the displacement and temperature. The influence of input parameters such as cutting speed, depth of cut and clearance on temperature, tool wear and displacement is investigated for various cutting conditions. From the results, the optimum conditions to obtain better damping in the boring process for chatter reduction are identified.
Keywords: Boring, chatter, mass damping, passive damping.