Search results for: Dirichlet process mixture model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28847

28367 Simulation-Based Optimization Approach for an Electro-Plating Production Process Based on Theory of Constraints and Data Envelopment Analysis

Authors: Mayada Attia Ibrahim

Abstract:

Evaluating and improving the electroplating production process is a key challenge for this type of operation. The process is influenced by several factors such as process parameters, process costs, and the production environment. Analyzing and optimizing all these factors together requires extensive analytical techniques that are not available in real industrial settings. This paper presents a practice-based framework for the evaluation and optimization of some of the crucial factors that affect the costs and production times associated with this type of process, namely energy costs, material costs, and product flow times. The proposed approach uses Design of Experiments, discrete-event simulation, and the Theory of Constraints, respectively, to identify the most significant factors affecting the production process, to simulate a real production line in order to observe the effect of these factors, and to locate possible bottlenecks. Several scenarios are generated as corrective strategies for improving the production line. Following that, an input-oriented CCR data envelopment analysis (DEA) model is used to evaluate and optimize the suggested scenarios.
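
As a rough illustration of the final evaluation step, the sketch below solves an input-oriented CCR DEA model as a linear program with SciPy; the scenario data (two inputs, one output) and the variable names are illustrative assumptions, not values from the study.

```python
# Hedged sketch: an input-oriented CCR DEA model solved as a linear program.
# The data (inputs = [energy cost, material cost], output = [throughput]) are
# illustrative placeholders for the improvement scenarios being compared.
import numpy as np
from scipy.optimize import linprog

X = np.array([[10.0, 4.0], [8.0, 5.0], [12.0, 3.0]])   # inputs per scenario (DMU)
Y = np.array([[100.0], [90.0], [110.0]])                # outputs per scenario (DMU)

def ccr_efficiency(o):
    """Efficiency of DMU o: min theta s.t. X^T lam <= theta*x_o, Y^T lam >= y_o, lam >= 0."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                         # variables: [theta, lam_1..lam_n]
    A_in = np.c_[-X[o], X.T]                            # input constraints
    A_out = np.c_[np.zeros(s), -Y.T]                    # output constraints
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(None, None)] + [(0, None)] * n,
                  method="highs")
    return res.fun

for o in range(len(X)):
    print(f"scenario {o}: CCR efficiency = {ccr_efficiency(o):.3f}")
```

An efficiency of 1.0 marks a scenario on the efficient frontier; the others are dominated by some combination of their peers.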

Keywords: electroplating process, simulation, design of experiment, performance optimization, theory of constraints, data envelopment analysis

Procedia PDF Downloads 85
28366 Hawkes Process-Based Reflexivity Analysis in the Cryptocurrency Market

Authors: Alev Atak

Abstract:

We study endogeneity in the cryptocurrency market through the branching ratio of the Hawkes process and evaluate the evolution of self-excitation in financial markets. We consider a semi-parametric self-exciting point process regression model in which the excitation function is assumed to be smooth and decreasing but otherwise unspecified, and the baseline intensity is assumed to be a linear function of the regressors. We apply the empirical analysis to the three largest crypto assets, i.e., Bitcoin, Ethereum, and Ripple, and provide a comparison with other financial assets such as the S&P 500, gold, and the volatility index VIX, observed from January 2015 to December 2020. The results depict variable and high levels of endogeneity in the basket of cryptocurrencies under investigation, underlining the evidence of a significant role of endogenous feedback mechanisms in the price formation process.
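
For intuition on the branching-ratio measure of endogeneity, the sketch below simulates a univariate Hawkes process with an exponential kernel and recovers the branching ratio n = alpha/beta by maximum likelihood. The parametric kernel, the parameter values, and the plain (non-regression) specification are simplifying assumptions of this sketch and do not reproduce the paper's semi-parametric estimator.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def intensity(t, events, mu, alpha, beta):
    past = np.asarray(events)
    past = past[past < t]
    return mu + alpha * np.exp(-beta * (t - past)).sum()

def simulate_hawkes(mu, alpha, beta, T):
    """Ogata thinning for lambda(t) = mu + alpha * sum_i exp(-beta (t - t_i))."""
    t, events = 0.0, []
    while True:
        lam_bar = intensity(t, events, mu, alpha, beta) + alpha   # upper bound after t
        t += rng.exponential(1.0 / lam_bar)
        if t > T:
            return np.array(events)
        if rng.uniform() * lam_bar <= intensity(t, events, mu, alpha, beta):
            events.append(t)

def neg_loglik(params, events, T):
    mu, alpha, beta = params
    if mu <= 0 or alpha < 0 or beta <= 0 or alpha >= beta:        # keep n = alpha/beta < 1
        return 1e12
    ll, A, prev = 0.0, 0.0, None
    for t in events:
        if prev is not None:
            A = np.exp(-beta * (t - prev)) * (1.0 + A)            # recursive kernel sum
        ll += np.log(mu + alpha * A)
        prev = t
    compensator = mu * T + (alpha / beta) * np.sum(1.0 - np.exp(-beta * (T - events)))
    return -(ll - compensator)

T = 1000.0
events = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, T=T)        # true n = 0.8/1.2 = 0.67
fit = minimize(neg_loglik, x0=np.array([0.3, 0.5, 1.0]), args=(events, T),
               method="Nelder-Mead")
mu_hat, alpha_hat, beta_hat = fit.x
print("estimated branching ratio (endogeneity):", alpha_hat / beta_hat)
```

A branching ratio close to one indicates that most events are triggered by earlier events rather than by exogenous news, which is the notion of reflexivity examined in the paper.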

Keywords: hawkes process, cryptocurrency, endogeneity, reflexivity

Procedia PDF Downloads 71
28365 Parameter Estimation for the Mixture of Generalized Gamma Model

Authors: Wikanda Phaphan

Abstract:

The mixture generalized gamma distribution is a combination of two distributions: the generalized gamma distribution and the length-biased generalized gamma distribution. These two distributions were presented by Suksaengrakcharoen and Bodhisuwan in 2014. Its probability density function (pdf) is fairly complex, which makes parameter estimation difficult; in particular, the estimators cannot be written in closed form, so numerical estimation is required. In this study, we present a parameter estimation method based on the expectation-maximization (EM) algorithm, the conjugate gradient method, and the quasi-Newton method. Data were generated by the acceptance-rejection method and used to estimate α, β, λ and p, where λ is the scale parameter, p is the weight parameter, and α and β are the shape parameters. A Monte Carlo study was used to assess the estimators' performance, with sample sizes of 10, 30 and 100 and 20 replications in each case. The effectiveness of the estimators was evaluated in terms of mean squared error and bias. The findings revealed that the EM algorithm produced estimates closest to the true values, while the maximum likelihood estimators obtained via the conjugate gradient and quasi-Newton methods were less precise than those obtained via the EM algorithm.
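
The sketch below illustrates the data-generation step: acceptance-rejection sampling from a two-component mixture of a generalized gamma density and its length-biased version. The parameter values, the gamma proposal, and the grid-based envelope constant M are illustrative assumptions rather than the authors' settings.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
a, c, lam, p = 2.0, 1.5, 1.0, 0.6            # shapes a, c, scale lam, weight p (illustrative)

f = stats.gengamma(a, c, scale=lam)           # generalized gamma component
mean_f = f.mean()                             # normalising constant of the length-biased pdf

def target_pdf(x):
    """p * gengamma pdf + (1 - p) * length-biased gengamma pdf."""
    x = np.asarray(x, dtype=float)
    return p * f.pdf(x) + (1.0 - p) * x * f.pdf(x) / mean_f

proposal = stats.gamma(a=3.0, scale=1.0)      # heavier-tailed proposal covering the target

# numerically estimate an envelope constant M >= target/proposal on a grid (sketch only)
grid = np.linspace(1e-6, 20.0, 4000)
M = 1.1 * np.max(target_pdf(grid) / proposal.pdf(grid))

def sample(n):
    out = []
    while len(out) < n:
        x = proposal.rvs(size=n, random_state=rng)
        u = rng.uniform(size=n)
        out.extend(x[u * M * proposal.pdf(x) <= target_pdf(x)])   # accept u <= h(x)/(M q(x))
    return np.array(out[:n])

draws = sample(5000)
target_mean = p * mean_f + (1 - p) * f.moment(2) / mean_f
print("sample mean %.3f vs target mean %.3f" % (draws.mean(), target_mean))
```

The accepted draws would then feed the EM, conjugate gradient, or quasi-Newton fitting step described in the abstract.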

Keywords: conjugate gradient method, quasi-Newton method, EM-algorithm, generalized gamma distribution, length biased generalized gamma distribution, maximum likelihood method

Procedia PDF Downloads 210
28364 Good Practices for Model Structure Development and Managing Structural Uncertainty in Decision Making

Authors: Hossein Afzali

Abstract:

Increasingly, decision analytic models are used to inform decisions about whether or not to publicly fund new health technologies. It is well recognised that the accuracy of model predictions is strongly influenced by the appropriateness of model structuring. However, there is relatively little methodological guidance on this issue in the guidelines developed by national funding bodies such as the Australian Pharmaceutical Benefits Advisory Committee (PBAC) and the National Institute for Health and Care Excellence (NICE) in the UK. This presentation aims to discuss issues around model structuring within decision making, with a focus on (1) the need for a transparent and evidence-based model structuring process to inform the most appropriate set of structural aspects for the base case analysis; and (2) the need to characterise structural uncertainty: where alternative plausible structural assumptions (or judgements) exist, the related structural uncertainty needs to be appropriately characterised. The presentation will provide an opportunity to share ideas and experiences on how the guidelines developed by national funding bodies address these issues and to identify areas for further improvement. First, a review and analysis of the literature and of the guidelines developed by PBAC and NICE will be provided. It will then be discussed how the issues around model structuring (including structural uncertainty) are not handled and justified in a systematic way within the decision-making process, what the potential impact on the quality of public funding decisions is, and how model structuring should be presented in submissions to national funding bodies. This presentation represents a contribution to good modelling practice within the decision-making process. Although the presentation focuses on the PBAC and NICE guidelines, the discussion applies more widely to many other national funding bodies that use economic evaluation to inform funding decisions but do not transparently address model structuring issues, e.g., the Medical Services Advisory Committee (MSAC) in Australia or the Canadian Agency for Drugs and Technologies in Health.

Keywords: decision-making process, economic evaluation, good modelling practice, structural uncertainty

Procedia PDF Downloads 171
28363 Graphene/ZnO/Polymer Nanocomposite Thin Film for Separation of Oil-Water Mixture

Authors: Suboohi Shervani, Jingjing Ling, Jiabin Liu, Tahir Husain

Abstract:

Offshore oil spills have become one of the most pressing environmental problems worldwide. In the current paper, a graphene/ZnO/polymer nanocomposite thin film is coated on a stainless steel mesh via a layer-by-layer deposition method. The structure of the materials is characterized by scanning electron microscopy (SEM) and X-ray diffraction (XRD). The total petroleum hydrocarbons (TPHs) and the separation efficiency are measured via gas chromatography with flame ionization detection (GC-FID). TPHs are reduced to 2 ppm, and the separation efficiency of the nanocomposite-coated mesh reaches ≥ 99% for the final sample. The nanocomposite-coated mesh is therefore a promising candidate for the separation of oil-water mixtures.

Keywords: oil spill, graphene, oil-water separation, nanocomposite

Procedia PDF Downloads 155
28362 Production of Ultra-Low Temperature by the Vapor Compression Refrigeration Cycles with Environment Friendly Working Fluids

Authors: Sameh Frikha, Mohamed Salah Abid

Abstract:

We investigate the performance of an integrated cascade (IC) refrigeration system which uses environmentally friendly zeotropic mixtures. Computational calculations have been carried out by varying the pressure levels at the evaporator and the condenser of the system. Effects of the refrigerant mass flow rate on the coefficient of performance (COP) are presented. We show that the integrated cascade system produces ultra-low temperatures in the evaporator by using an environmentally friendly zeotropic mixture.
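
For reference, the coefficient of performance of a vapor-compression stage follows from the enthalpy balance sketched below; the 1-4 state numbering is the conventional cycle notation and is an assumption of this sketch, not notation taken from the paper. For the integrated cascade, an overall COP divides the low-stage cooling duty by the total compressor work of both stages.

```latex
\mathrm{COP} \;=\; \frac{\dot{Q}_{\mathrm{evap}}}{\dot{W}_{\mathrm{comp}}}
             \;=\; \frac{\dot{m}\,(h_1 - h_4)}{\dot{m}\,(h_2 - h_1)}
```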

Keywords: coefficient of performance, environment friendly zeotropic mixture, integrated cascade, ultra low temperature, vapor compression refrigeration cycles

Procedia PDF Downloads 249
28361 Kinetic Façade Design Using 3D Scanning to Convert Physical Models into Digital Models

Authors: Do-Jin Jang, Sung-Ah Kim

Abstract:

In designing a kinetic façade, it is hard for the designer to create digital models because of the complex geometry in motion. This paper presents a methodology for converting a point cloud of a physical model into a single digital model with a defined topology and motion. The method uses a Microsoft Kinect sensor, and color markers are defined and applied to three paper-folding-inspired designs. Although the resulting digital model cannot represent the whole folding range of the physical model, the method supports the designer in conducting a performance-oriented design process with the rough physical model within the reduced folding range.

Keywords: design media, kinetic facades, tangible user interface, 3D scanning

Procedia PDF Downloads 405
28360 The Framework of System Safety for Multi Human-in-The-Loop System

Authors: Hideyuki Shintani, Ichiro Koshijima

Abstract:

In a Cyber-Physical System (CPS), if a large number of persons are involved in the process, the role of each person in the CPS may differ from that in a one-person system. It is also necessary to consider how Human-in-the-Loop Cyber-Physical Systems (HiTLCPS) ensure the safety of each person in the loop. In this paper, the authors discuss a system safety framework, with an illustrative example using the STAMP model, to clarify which safety aspects should be considered and what role each person in the loop should have.

Keywords: cyber-physical-system, human-in-the-loop, safety, STAMP model

Procedia PDF Downloads 313
28359 Unsteady Rayleigh-Bénard Convection of Nanoliquids in Enclosures

Authors: P. G. Siddheshwar, B. N. Veena

Abstract:

Rayleigh-Bénard convection of a nanoliquid in shallow, square and tall enclosures is studied using the Khanafer-Vafai-Lightstone single-phase model. The thermophysical properties of water, copper, copper oxide, alumina, silver and titania at 300 K under stagnant conditions, collected from the literature, are used together with phenomenological laws and mixture theory to calculate the thermophysical properties of the water-based nanoliquids. Free-free, rigid-rigid and rigid-free boundary conditions are considered in the study. The intractable Lorenz model for each boundary combination is derived and then reduced to the tractable Ginzburg-Landau model. The amplitude thus obtained is used to quantify the heat transport in terms of the Nusselt number. Addition of nanoparticles is shown not to alter the influence of the nature of the boundaries on the onset of convection or on heat transport. Amongst the three enclosures considered, it is found that tall and shallow enclosures transport maximum and minimum energy, respectively. Enhancement of heat transport due to nanoparticles in the three enclosures is found to be in the range 3% - 11%. Comparison of results in the case of rigid-rigid boundaries is made with those of an earlier work, and good agreement is found. The study has the limitation that thermophysical properties are calculated using quantities modelled for static conditions.
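
The reduction from the Lorenz model to the Ginzburg-Landau (Stuart-Landau) amplitude equation is the step that makes the analysis tractable; the sketch below integrates an amplitude equation of that form and compares the saturated amplitude with its analytic value. The coefficients Q1 and Q2 are illustrative placeholders; in the paper they depend on the nanoliquid properties and the boundary combination.

```python
import numpy as np
from scipy.integrate import solve_ivp

Q1, Q2 = 0.8, 2.0                          # illustrative growth / saturation coefficients
rhs = lambda t, A: Q1 * A - Q2 * A**3      # Ginzburg-Landau amplitude equation dA/dt

sol = solve_ivp(rhs, (0.0, 30.0), [1e-3])  # small initial amplitude
print("numerical steady amplitude :", sol.y[0, -1])
print("analytic  steady amplitude :", np.sqrt(Q1 / Q2))
# In the paper, this saturated amplitude feeds the Nusselt-number expression that
# quantifies the heat transport in each enclosure.
```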

Keywords: enclosures, free-free, rigid-rigid, rigid-free boundaries, Ginzburg-Landau model, Lorenz model

Procedia PDF Downloads 238
28358 Logistic Regression Model versus Additive Model for Recurrent Event Data

Authors: Entisar A. Elgmati

Abstract:

Recurrent infant diarrhoea is studied using daily data collected in Salvador, Brazil over one year and three months. A logistic regression model is fitted instead of Aalen's additive model, using the same covariates as in the additive-model analysis. The model gives results reasonably similar to those of the additive regression model. In addition, the problem of the estimated conditional probabilities not being constrained between zero and one in the additive model is solved here. Also, martingale residuals, which have been used to judge the goodness of fit of the additive model, are shown to be useful for judging the goodness of fit of the logistic model.
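
A minimal sketch of the modelling idea is given below: the daily follow-up of each child is expanded into child-day records, and a logistic regression for the daily event probability is fitted with statsmodels. The covariates and the simulated data are placeholders, not the Salvador dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_children, n_days = 200, 100
df = pd.DataFrame({
    "child": np.repeat(np.arange(n_children), n_days),
    "day": np.tile(np.arange(n_days), n_children),
    "age_months": np.repeat(rng.integers(6, 48, n_children), n_days),
    "sanitation": np.repeat(rng.integers(0, 2, n_children), n_days),
})
# simulate a daily event indicator whose probability depends on the covariates
lin = -3.0 - 0.02 * df["age_months"] - 0.5 * df["sanitation"]
df["event"] = (rng.uniform(size=len(df)) < 1.0 / (1.0 + np.exp(-lin))).astype(int)

# one logistic model for the daily event probability; unlike the additive model,
# the fitted probabilities are automatically constrained to (0, 1)
fit = smf.glm("event ~ age_months + sanitation", data=df,
              family=sm.families.Binomial()).fit()
print(fit.summary())
```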

Keywords: additive model, cumulative probabilities, infant diarrhoea, recurrent event

Procedia PDF Downloads 622
28357 The BL-5D Model: The Development of a Model of Instructional Design for Blended Learning Activities

Authors: Damian Gordon, Paul Doyle, Anna Becevel, Júlia Vilafranca Molero, Cinta Gascon, Arianna Vitiello, Tina Baloh

Abstract:

It has long been recognized that the creation of any teaching content can be enhanced if the development process follows a pre-defined approach, which is often referred to as an instructional design methodology. These methodologies typically define a number of stages, or phases, that an educator should undertake to help ensure the quality of the final teaching content that is developed. In this paper, we present an instructional design methodology that is focused specifically on the introduction of blended resources into a heretofore bricks-and-mortar course. To achieve this, research was undertaken concerning a range of models of instructional design, as well as literature covering some of the key challenges and “pain points” of blending. Following this, our model, the BL-5D model, is presented, which incorporates some key questions at each stage of this five-stage methodology to guide the development process. Finally, a discussion of some of the key themes and issues that have been uncovered in this work is presented, as well as a template for a blended learning case study that emerged from this approach.

Keywords: blended learning, challenges of blended learning, design methodologies, instructional design

Procedia PDF Downloads 97
28356 A Simulated Evaluation of Model Predictive Control

Authors: Ahmed AlNouss, Salim Ahmed

Abstract:

Process control refers to the techniques used to control the variables in a process in order to maintain them at their desired values. Advanced process control (APC) is a broad term within the control domain that refers to different kinds of process control and control-related tools, for example, model predictive control (MPC), statistical process control (SPC), fault detection and classification (FDC), and performance assessment. APC is often used for solving multivariable control problems, and model predictive control (MPC) is one of only a few advanced control methods used successfully in industrial applications. Advanced control is expected to bring many benefits to plant operation; however, the extent of the benefits is plant specific, and the application requires a large investment. This calls for an analysis of the expected benefits before the control is implemented. In a real plant, simulation studies are carried out along with some experimentation to determine the improvement in plant performance due to advanced control. In this research, such an exercise is undertaken to assess the need for APC. The main objectives of the paper are as follows: (1) to apply MPC to a number of simulation case studies in order to demonstrate the need for MPC by comparing its performance with that of proportional-integral-derivative (PID) controllers; (2) to study the effect of controller parameters on control performance; and (3) to develop appropriate performance indices (PI) for comparing different controllers and a novel way of presenting a controller tuning map. These objectives were achieved by applying a PID controller and a particular type of MPC, dynamic matrix control (DMC), to the multi-tank process simulated in Loop-Pro. The controller performance was then evaluated for different controller parameters, using indices based on the difference between the set point and the process variable, in order to compare the two controllers. The same principle was applied to the continuous stirred tank heater (CSTH) and continuous stirred tank reactor (CSTR) processes simulated in Matlab, for which dedicated programs were written to evaluate the performance of the PID and MPC controllers. Finally, these performance indices, along with the corresponding controller parameters, were plotted using SigmaPlot. As a result, the improvement in the performance of the control loops was quantified using relevant indices to justify the need for and importance of advanced process control. It was also shown that, with appropriate indices, a predictive controller can improve the performance of a control loop significantly.
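
A hedged sketch of the kind of error-based performance indices used for such comparisons is shown below: a PI loop on a first-order process is simulated and scored with IAE and ISE for a few gain settings, which is the raw material of a tuning map. The process parameters and tunings are illustrative, not those of the Loop-Pro multi-tank process, and no DMC controller is implemented here.

```python
import numpy as np

def simulate_pi(Kc, tauI, Kp=2.0, taup=5.0, dt=0.05, t_end=60.0, setpoint=1.0):
    """Simulate a PI loop on taup*dy/dt = -y + Kp*u and return (IAE, ISE)."""
    n = int(t_end / dt)
    y, integral = 0.0, 0.0
    iae = ise = 0.0
    for _ in range(n):
        e = setpoint - y
        integral += e * dt
        u = Kc * (e + integral / tauI)        # PI control law
        y += dt * (-y + Kp * u) / taup        # explicit Euler step of the process
        iae += abs(e) * dt                    # integral of absolute error
        ise += e * e * dt                     # integral of squared error
    return iae, ise

# sweep a few tunings; plotting IAE/ISE over a (Kc, tauI) grid gives a tuning map
for Kc, tauI in [(0.5, 5.0), (1.0, 5.0), (2.0, 5.0)]:
    iae, ise = simulate_pi(Kc, tauI)
    print(f"Kc={Kc:<4} tauI={tauI}:  IAE={iae:.2f}  ISE={ise:.2f}")
```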

Keywords: advanced process control (APC), control loop, model predictive control (MPC), proportional integral derivatives (PID), performance indices (PI)

Procedia PDF Downloads 397
28355 Method for Tuning Level Control Loops Based on Internal Model Control and Closed Loop Step Test Data

Authors: Arnaud Nougues

Abstract:

This paper describes a two-stage methodology derived from internal model control (IMC) for tuning a proportional-integral-derivative (PID) controller for level loops or other integrating processes in an industrial environment. The focus is on ease of use and implementation speed, which are critical for an industrial application. Tuning can be done with minimum effort and without the need for time-consuming open-loop step tests on the plant. The first stage of the method applies to levels only: the vessel residence time is calculated from equipment dimensions and used to derive a set of preliminary proportional-integral (PI) settings with IMC. The second stage, re-tuning in closed loop, applies to levels as well as other integrating processes: a tuning correction mechanism has been developed based on a series of closed-loop simulations with model errors. The tuning correction is derived from a simple closed-loop step test and the application of a generic correlation between the observed overshoot and the integral time correction. A spin-off of the method is that an estimate of the vessel residence time (for levels) or of the open-loop process gain (for other integrating processes) is obtained from the closed-loop data.
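
The sketch below illustrates only the first stage, deriving preliminary PI settings from the vessel residence time. It uses a common IMC/SIMC-type rule for an integrating process with negligible dead time, which is an assumption of this sketch rather than the paper's exact correlation, and it does not reproduce the closed-loop correction stage.

```python
def level_pi_from_residence_time(volume_m3, flow_m3_per_h, tau_c_factor=1.0):
    """Preliminary (Kc, tau_I) for a level loop from the vessel residence time.

    A level behaves as an integrator with gain K' ~ 1/residence time (in
    normalised level / normalised flow units). For G = K'/s with negligible
    dead time, a SIMC/IMC-type rule gives Kc = 1/(K'*tau_c) and tau_I = 4*tau_c,
    where tau_c is the chosen closed-loop time constant.
    """
    tau_res = volume_m3 / flow_m3_per_h       # residence time, h
    k_prime = 1.0 / tau_res                   # integrating-process gain, 1/h
    tau_c = tau_c_factor * tau_res            # closed-loop time constant choice
    Kc = 1.0 / (k_prime * tau_c)
    tau_I = 4.0 * tau_c
    return Kc, tau_I

# hypothetical vessel: 12 m3 holdup, 6 m3/h throughput
Kc, tau_I = level_pi_from_residence_time(volume_m3=12.0, flow_m3_per_h=6.0)
print(f"preliminary settings: Kc = {Kc:.2f}, tau_I = {tau_I:.2f} h")
```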

Keywords: closed-loop model identification, IMC-PID tuning method, integrating process control, on-line PID tuning adaptation

Procedia PDF Downloads 206
28354 Oil Extraction from Microalgae Dunalliela sp. by Polar and Non-Polar Solvents

Authors: A. Zonouzi, M. Auli, M. Javanmard Dakheli, M. A. Hejazi

Abstract:

Microalgae are microscopic photosynthetic organisms. Nowadays, microalgae are used as nutrient-dense foods and sources of fine chemicals. They contain significant amounts of lipids, carotenoids, vitamins, protein, minerals, chlorophyll, and pigments. Oil extraction from algae is currently a much-debated topic, because an efficient extraction method could decrease the process cost and thereby determine the sustainability of algae-based foods. Research shows that solvent extraction using a chloroform/methanol (2:1) mixture is one of the most efficient methods for oil extraction from algal cells, but both methanol and chloroform are toxic solvents, so the extracted oil is not suitable for food applications. In this paper, the effect of two food-grade solvents (hexane and hexane/isopropanol) on the oil extraction yield from the microalga Dunaliella sp. was investigated, and the results were compared with the chloroform/methanol (2:1) extraction yield. The oil extraction yields using hexane, hexane/isopropanol (3:2), and chloroform/methanol (2:1) were 5.4, 13.93, and 17.5 (% w/w, dry basis), respectively. The fatty acid profile derived from GC showed that palmitic (36.62%), oleic (18.62%), and stearic (19.08%) acids form the main portion of the fatty acid composition of Dunaliella sp. oil. It was concluded that the addition of isopropanol as a polar solvent could increase the extraction yield significantly: isopropanol dissolves cell-wall phospholipids and enhances the release of intracellular lipids, which improves the access of hexane to the fatty acids.

Keywords: fatty acid profile‎, microalgae‎, oil extraction‎, polar solvent‎

Procedia PDF Downloads 357
28353 Determining the Width and Depths of Cut in Milling on the Basis of a Multi-Dexel Model

Authors: Jens Friedrich, Matthias A. Gebele, Armin Lechler, Alexander Verl

Abstract:

Chatter vibrations and process instabilities are the most important factors limiting the productivity of the milling process. Chatter can lead to damage of the tool, the part, or the machine tool. Therefore, the estimation and prediction of process stability are very important. The process stability depends on the spindle speed, the depth of cut and the width of cut. In milling, the process conditions are defined in the NC program. While the spindle speed is directly coded in the NC program, the depth and width of cut are unknown. This paper presents a new simulation-based approach for predicting the depth and width of cut of a milling process. The prediction is based on a material removal simulation with an analytically represented tool shape and a multi-dexel approach for the workpiece. The new calculation method allows the direct estimation of the depth and width of cut, which are the parameters influencing process stability, instead of the removed volume used by existing approaches. This knowledge can be used to predict the stability of new, unknown parts. Moreover, with an additional vibration sensor, the stability lobe diagram of a milling process can be estimated and improved based on the estimated depth and width of cut.
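
To make the idea concrete, the sketch below uses a deliberately simplified single-direction dexel (height-field) representation of the workpiece and a flat end mill to extract the depth and width of cut at each tool step. The grid resolution, tool path and dimensions are illustrative assumptions, and the paper's full multi-dexel model with an analytic tool shape is not reproduced.

```python
import numpy as np

cell = 0.5                                   # grid resolution, mm
nx, ny = 200, 80
z = np.full((nx, ny), 10.0)                  # stock top surface at z = 10 mm

def mill_step(z, xc, yc, radius, z_bottom):
    """Remove material under a flat end mill; return (depth of cut, width of cut)."""
    xs = (np.arange(nx) + 0.5) * cell
    ys = (np.arange(ny) + 0.5) * cell
    X, Y = np.meshgrid(xs, ys, indexing="ij")
    under_tool = (X - xc) ** 2 + (Y - yc) ** 2 <= radius ** 2
    engaged = under_tool & (z > z_bottom)    # dexels actually being cut
    if not engaged.any():
        return 0.0, 0.0
    depth = float((z[engaged] - z_bottom).max())
    # width of cut: extent of engaged dexels perpendicular to an x-direction feed
    width = float(engaged.any(axis=0).sum() * cell)
    z[under_tool] = np.minimum(z[under_tool], z_bottom)   # update the height field
    return depth, width

# feed the tool along x at constant axial depth; the y position gives partial immersion
for xc in np.arange(5.0, 95.0, 2.0):
    ap, ae = mill_step(z, xc=xc, yc=2.0, radius=5.0, z_bottom=8.0)
print(f"last step: depth of cut = {ap:.2f} mm, width of cut = {ae:.2f} mm")
```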

Keywords: dexel, process stability, material removal, milling

Procedia PDF Downloads 510
28352 Novel Wound Healing Biodegradable Patch of Bioactive

Authors: Abhay Asthana, Shally Toshkhani, Gyati Shilakari

Abstract:

The present research aimed to develop a biodegradable dermal patch formulation for wound healing that acts in a novel, sustained and systematic manner. The goal is to reduce the frequency of dressings through improved drug delivery and thereby enhance therapeutic performance. In the present study, an optimized formulation was designed using component polymers and excipients (e.g., hydroxypropyl methyl cellulose, ethyl cellulose, and gelatin) to impart significant folding endurance, elasticity and strength. Gelatin was mixed with ethylene glycol, and chitosan dissolved in a suitable medium was added to the gelatin mixture with stirring. With continued stirring, curcumin was added in an optimized ratio to obtain a homogeneous dispersion, and the polymers were then dispersed into the final formulation with stirring. The mixture was sonicated and cast to obtain the film; all steps were carried out under strict aseptic conditions. The final formulation was a thin, uniformly smooth-textured film with a dark brown-yellow color. The optimized formulation showed a folding endurance of around 20 to 21 folds without cracking at room temperature (23 °C). The drug content was in the range of 96 to 102%, and the film passed the content uniformity test. The final moisture content of the optimized formulation film was not more than 9.0%. The films passed stability studies conducted under refrigerated conditions (4 ± 0.2 °C) and at room temperature (23 ± 2 °C) for 30 days; further, the drug content and texture remained unchanged in stability studies conducted at room temperature (23 ± 2 °C) for 45 and 90 days. The cumulative drug release was found to be 80% in 12 h and matched the biodegradation rate, with a correlation factor R² > 0.9. The developed film-based formulation shows promising results in terms of stability and release profiles.

Keywords: biodegradable, patch, bioactive, polymer

Procedia PDF Downloads 504
28351 Investigation of Heat Transfer Mechanism Inside Shell and Tube Latent Heat Thermal Energy Storage Systems

Authors: Saeid Seddegh, Xiaolin Wang, Alan D. Henderson, Dong Chen, Oliver Oims

Abstract:

The main objective of this research is to study the heat transfer processes and phase change behaviour of a phase change material (PCM) in shell-and-tube latent heat thermal energy storage (LHTES) systems. The thermal behaviour in vertical and horizontal shell-and-tube thermal energy storage systems is compared using a pure thermal conduction model and a combined conduction-convection heat transfer model. The model is first validated against published experimental data available in the literature and then used to study the temperature variation, solid-liquid interface, phase distribution, and total melting and solidification times during the melting and solidification of the PCM. The simulation results show that the combined convection-conduction model better describes the energy transfer in the PCM during the melting process. In contrast, heat transfer by conduction is more significant during the solidification process, since the two models show little difference there. It was also concluded that during the charging process in the horizontal orientation, convective heat transfer has a strong effect on the melting of the upper part of the solid PCM and is less significant during the melting of the lower half, whereas in the vertical orientation convective heat transfer is equally active during the entire charging process. In the solidification process, the thermal behaviour shows no difference between the horizontal and vertical systems.

Keywords: latent heat thermal energy storage, phase change material, natural convection, melting, solidification, shell and tube heat exchanger

Procedia PDF Downloads 545
28350 The Forensic Swing of Things: The Current Legal and Technical Challenges of IoT Forensics

Authors: Pantaleon Lutta, Mohamed Sedky, Mohamed Hassan

Abstract:

The inability of organizations to put in place management control measures for the complexities of the Internet of Things (IoT) continues to be a risk concern. Policymakers have been left scrambling to find measures to combat these security and privacy concerns. IoT forensics is a cumbersome process, as there is no standardization of IoT products and little or no historical data is stored on the devices. This paper highlights why IoT forensics is a unique undertaking and brings out the legal challenges encountered in the investigation process. A quadrant model is presented to study the conflicting aspects of IoT forensics. The model analyses the effectiveness of the forensic investigation process against the admissibility and integrity of the evidence, taking into account user privacy and the providers' compliance with laws and regulations. Our analysis concludes that a semi-automated forensic process using machine learning could eliminate the human factor from the profiling and surveillance processes and hence resolve the issues of data protection (privacy and confidentiality).

Keywords: cloud forensics, data protection laws, GDPR, IoT forensics, machine learning

Procedia PDF Downloads 141
28349 Automatic Queuing Model Applications

Authors: Fahad Suleiman

Abstract:

Queuing, in a medical system, is the process of moving patients in a specific sequence to a specific service according to the nature of their illness. The term scheduling stands for the process of computing a schedule; this may be done by a queuing-based scheduler. This paper focuses on the medical consultancy system, the different queuing algorithms used in healthcare systems to serve patients, and the average waiting time. The aim of this paper is to build an automatic queuing system for organizing the medical queue that can analyse the queue status and decide which patient to serve. The new queuing architecture model can switch between different scheduling algorithms according to the testing results and the average waiting time. The main innovation of this work is that the average waiting time is modelled and taken into account, together with the process of switching to the scheduling algorithm that gives the best average waiting time.
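
As a minimal sketch of how scheduling rules can be compared on average waiting time, the simulation below serves the same synthetic patient stream under first-come-first-served and shortest-consultation-first rules; the arrival and service distributions and the single-server setting are illustrative assumptions, not the paper's system.

```python
import heapq
import numpy as np

rng = np.random.default_rng(7)
n_patients = 500
arrivals = np.cumsum(rng.exponential(4.0, n_patients))   # roughly one arrival every 4 min
service = rng.exponential(3.5, n_patients)                # consultation length, min

def average_wait(order_by_service):
    """Single doctor; queue ordered by service time (SJF) or arrival time (FCFS)."""
    t, waits, backlog, i = 0.0, [], [], 0
    while len(waits) < n_patients:
        # admit everyone who has arrived by time t into the waiting queue
        while i < n_patients and arrivals[i] <= t:
            key = service[i] if order_by_service else arrivals[i]
            heapq.heappush(backlog, (key, arrivals[i], service[i]))
            i += 1
        if not backlog:                       # doctor idle until the next arrival
            t = arrivals[i]
            continue
        _, arr, srv = heapq.heappop(backlog)  # serve the next patient under the rule
        waits.append(t - arr)
        t += srv
    return np.mean(waits)

print("FCFS average wait: %.1f min" % average_wait(False))
print("SJF  average wait: %.1f min" % average_wait(True))
```

A switching architecture of the kind described above would monitor such averages online and select the rule currently giving the lowest waiting time.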

Keywords: queuing systems, queuing system models, scheduling algorithms, patients

Procedia PDF Downloads 338
28348 The Six 'P' Model: Principles of Inclusive Practice for Inclusion Coaches

Authors: Tiffany Gallagher, Sheila Bennett

Abstract:

Drawing on data from a larger study, this research is set in a small school district in Ontario, Canada, that has made a transition from self-contained classes for students with exceptionalities to inclusive classroom placements for all students with their age-appropriate peers. The school board aided this transition by hiring Inclusion Coaches with a background in special education to work alongside teachers as partners and inform their inclusive practice. Based on qualitative data from four focus groups conducted with Inclusion Coaches, as well as four blog-style reflections collected at various points over two years, six principles of inclusive practice were identified for coaches. The six principles form a model during transition: pre-requisite, process, precipice, promotion, proof, and promise. These principles are encapsulated in a visual model of a spiraling staircase displaying the conditions that exist prior to coaching, during coaching interactions, and for the sustainability of coaching. The six principles are iterative and should be revisited each time a coaching interaction is initiated. Exploring inclusion coaching as a model emulates coaching in other contexts and allows us to examine an established process through a new lens. This research becomes increasingly important as more school boards transition toward inclusive classrooms; The Six 'P' Model: Principles of Inclusive Practice for Inclusion Coaches allows for a unique look into a scaffolding model for building educator capacity in an inclusive setting.

Keywords: capacity building, coaching, inclusion, special education

Procedia PDF Downloads 235
28347 Understanding the Importance of Participation in the City Planning Process and Its Influencing Factors

Authors: Louis Nwachi

Abstract:

Urban planning systems in most countries still rely on expert-driven, top-down technocratic plan-making processes rather than a public, people-led process. This paper sets out to evaluate the need for public participation in the plan-making process and to highlight the factors that affect public participation in it. In doing so, it adopted a qualitative approach based on document review and interviews grounded in real-world practice. A case study strategy, using the Metropolitan Area of Abuja, the capital of Nigeria, as the study sample, was used in carrying out the research. The research finds that participation is an important tool in the plan-making process and that public engagement contributes to the identification of key urban issues that are unique to specific local areas, thereby contributing to the establishment of priorities and, in turn, to the mobilization of resources to meet the identified needs. It also finds that the development of a participation model by city authorities encourages public engagement and helps to build trust between those in authority and the different key stakeholder groups involved in the plan-making process.

Keywords: plan-making, participation, urban planning, city

Procedia PDF Downloads 85
28346 A Closed-Loop Design Model for Sustainable Manufacturing by Integrating Forward Design and Reverse Design

Authors: Yuan-Jye Tseng, Yi-Shiuan Chen

Abstract:

In this paper, a new concept of a closed-loop design model is presented. The closed-loop design model is developed by integrating forward design and reverse design. Based on this concept, a closed-loop design model for sustainable manufacturing is developed through the integrated evaluation of forward design, reverse design, and green manufacturing using a fuzzy analytic network process. In the design stage of a product, with a given product requirement and objective, there can be different ways to design the detailed components and specifications; therefore, there can be different design cases that achieve the same product requirement and objective. In the design evaluation stage, these different design cases must be analyzed and evaluated. The purpose of this research is to develop a model for evaluating the design cases by integrated evaluation of the forward design, reverse design, and green manufacturing models. A fuzzy analytic network process model is presented for the integrated evaluation of the criteria in the three models, and the comparison matrices for evaluating the criteria in the three groups are established. The total relational values among the three groups represent the total relational effects. In application, a supermatrix can be created, and the total relational values can be used to evaluate the design cases and select the final design case. An example product is demonstrated in this presentation. It shows that the model is useful for the integrated evaluation of forward design, reverse design, and green manufacturing to achieve the closed-loop design objective for sustainable manufacturing.
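
The elementary building block of such an analytic network process evaluation is deriving priority weights from a pairwise comparison matrix; the sketch below computes the principal-eigenvector weights and a consistency ratio for one illustrative matrix. The judgements are placeholders, and the paper's fuzzification of the comparisons and assembly of the supermatrix are not reproduced.

```python
import numpy as np

# pairwise comparisons of three criteria groups (forward design, reverse design,
# green manufacturing) on Saaty's 1-9 scale; A[i, j] = importance of i over j
A = np.array([[1.0,  3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # principal eigenvalue / eigenvector
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                 # priority weights of the three groups

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)         # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]          # Saaty's random index
print("weights:", np.round(w, 3), " consistency ratio:", round(ci / ri, 3))
```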

Keywords: design evaluation, forward design, reverse design, closed-loop design, supply chain management, closed-loop supply chain, fuzzy analytic network process

Procedia PDF Downloads 659
28345 Effect of Cocoa Pod Ash and Poultry Manure on Soil Properties and Cocoyam Productivity of Nutrient-Depleted Tropical Alfisol

Authors: T. M. Agbede, A. O. Adekiya

Abstract:

An experiment was carried out for three consecutive years at Owo, southwest Nigeria. The objective of the investigation was to determine the effect of cocoa pod ash (CPA) and poultry manure (PM), applied solely and in combination, as fertilizer sources on soil properties, leaf nutrient composition, and the growth and yield of cocoyam. Three soil amendments - CPA, PM (sole forms), and a CPA and PM mixture - were applied at 7.5 t ha-1, with an inorganic fertilizer (NPK 15-15-15) at 400 kg ha-1 as a reference and natural soil fertility, NSF (control), arranged in a randomized complete block design with three replications. Results showed that the soil amendments significantly increased (p = 0.05) corm and cormel weights and growth of cocoyam, soil and leaf N, P, K, Ca and Mg, and soil pH and organic carbon (OC) concentrations compared with the NSF (control). The CPA+PM mixture increased corm and cormel weights, plant height and leaf area of cocoyam by 40, 39, 42, and 48%, respectively, compared with the inorganic fertilizer (NPK), and by 13, 12, 15 and 7%, respectively, compared with PM alone. Sole and mixed forms of the soil amendments showed remarkable improvement in soil physical properties compared with NPK and the NSF (control). The CPA+PM mixture applied at 7.5 t ha-1 was the most effective treatment in improving cocoyam yield and growth parameters and soil and leaf nutrient composition.

Keywords: cocoa pod ash, cocoyam, poultry manure, soil and leaf nutrient composition

Procedia PDF Downloads 355
28344 Application of Griddization Management to Construction Hazard Management

Authors: Lingzhi Li, Jiankun Zhang, Tiantian Gu

Abstract:

Hazard management that can prevent fatal accidents and property losses is a fundamental process during the construction stage of buildings. However, due to a lack of safety supervision resources and to operational pressures, the conduct of hazard management in China is often poor and ineffective. In order to improve the quality of construction safety management, it is critical to explore the use of information technologies to ensure that the hazard management process is efficient and effective. After examining the existing problems of construction hazard management in China, this paper develops a griddization management model for construction hazard management. First, following the knowledge grid infrastructure, a griddization computing infrastructure for construction hazard management is designed, which includes five layers: the resource entity layer, information management layer, task management layer, knowledge transformation layer and application layer. This infrastructure serves as the technical support for realizing grid management. Second, the study divides construction hazards into grids at the city, district and construction-site levels according to grid principles. Last, a griddization management process including hazard identification, assessment and control is developed, in which all stakeholders of construction safety management, such as owners, contractors, supervision organizations and government departments, take the corresponding responsibilities. Finally, a case study based on actual construction hazard identification, assessment and control is used to validate the effectiveness and efficiency of the proposed griddization management model. The advantage of the model is that it realizes information sharing and cooperative management between the various safety management departments.

Keywords: construction hazard, griddization computing, grid management, process

Procedia PDF Downloads 262
28343 Investigated Optimization of Davidson Path Loss Model for Digital Terrestrial Television (DTTV) Propagation in Urban Area

Authors: Pitak Keawbunsong, Sathaporn Promwong

Abstract:

This paper presents an investigation of the efficiency of the optimized Davidson path loss model, in order to identify a suitable path loss model for the design and planning of DTTV propagation in small and medium urban areas in southern Thailand. Hadyai City in Songkla Province is chosen as the case study, for which analytical data on the electric field strength were collected. The optimization is conducted through the least-squares method, while the efficiency is measured by the statistical relative error (RE). The least-squares method yields the offset and slope of the frequency term to be used in the optimization process. The statistical results show that the RE of the original Davidson model is the lowest when compared with the optimized Davidson and Hata models. Thus, the original Davidson path loss model is the most accurate and the most suitable for planning the DTTV propagation network design.
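
A hedged sketch of the least-squares step is shown below: an offset and a slope are fitted to path-loss samples against log-distance, and the fit is scored with a relative-error metric. The synthetic measurements, the 35 log10(d) trend and the RE definition are placeholders for illustration, not the Hadyai drive-test data or the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(3)
d_km = rng.uniform(0.5, 15.0, 300)                        # measurement distances, km
# synthetic "measured" path loss around a 35*log10(d) trend with log-normal shadowing
pl_meas = 120.0 + 35.0 * np.log10(d_km) + rng.normal(0.0, 6.0, 300)

# least-squares fit of offset and slope against log10(distance)
X = np.column_stack([np.ones_like(d_km), np.log10(d_km)])
offset, slope = np.linalg.lstsq(X, pl_meas, rcond=None)[0]

pl_model = offset + slope * np.log10(d_km)
re = np.mean(np.abs(pl_model - pl_meas) / np.abs(pl_meas))   # relative-error score
print(f"offset = {offset:.1f} dB, slope = {slope:.1f} dB/decade, RE = {re:.3%}")
```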

Keywords: DTTV propagation, path loss model, Davidson model, least square method

Procedia PDF Downloads 326
28342 General Purpose Graphic Processing Units Based Real Time Video Tracking System

Authors: Mallikarjuna Rao Gundavarapu, Ch. Mallikarjuna Rao, K. Anuradha Bai

Abstract:

Real-time video tracking is a challenging task for computing professionals. The performance of video tracking techniques is greatly affected by the background detection and elimination process. Local regions of the image frame contain vital information about the background and foreground. However, pixel-level processing of local regions consumes a considerable amount of computational time and memory space with traditional approaches. In our approach, we exploit the concurrent computational ability of general-purpose graphics processing units (GPGPU) to address this problem. A Gaussian mixture model (GMM) with adaptive weighted kernels is used for detecting the background. The weights of the kernels are influenced by local regions and are updated according to the inter-frame variations of these regions. The proposed system has been tested with GPU devices such as the GeForce GTX 280 and the Quadro K2000. The results are encouraging, with a maximum speed-up of 10X compared to the sequential approach.
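
For orientation, the sketch below shows a CPU reference pipeline for Gaussian-mixture background subtraction using OpenCV's MOG2 implementation followed by connected components; the paper's contribution, running the per-pixel mixture updates with locally weighted kernels concurrently on a GPGPU, is not reproduced here, and "input.avi" is a placeholder file name.

```python
import cv2

cap = cv2.VideoCapture("input.avi")                       # placeholder video source
# each pixel is modelled by a mixture of Gaussians whose weights adapt frame to frame
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=16,
                                                detectShadows=False)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg_mask = subtractor.apply(frame)                     # 0 = background, 255 = foreground
    # connected components give the moving blobs to be tracked downstream
    n_labels, labels = cv2.connectedComponents(fg_mask)
    cv2.imshow("foreground", fg_mask)
    if cv2.waitKey(1) == 27:                              # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```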

Keywords: connected components, embrace threads, local weighted kernel, structuring elements

Procedia PDF Downloads 424
28341 Identifying and Analyzing the Role of Brand Loyalty towards Incumbent Smartphones in New Branded Smartphone Adoption: Approach by Dual Process Theory

Authors: Lee Woong-Kyu

Abstract:

Fierce competition in the smartphone market may encourage users to switch brands when buying a new smartphone. However, many smartphone users continue to use the same brand even though other brands' smartphones are perceived to be more attractive. The purpose of this study is to identify and analyze the effects of brand loyalty toward the incumbent smartphone on new smartphone adoption. For this purpose, a research model including two hypotheses - a positive effect on rational judgments and a negative effect on rational judgments - is proposed based on dual process theory. For the validation of the research model, data were collected by surveying Korean university students and tested by a group comparison between high and low brand loyalty. The results show that both hypotheses were statistically supported.

Keywords: brand loyalty, dual process theory, incumbent smartphone, smartphone adoption

Procedia PDF Downloads 281
28340 Developing a Process and Cost Model for Xanthan Biosynthesis from Bioethanol Production Waste Effluents

Authors: Bojana Ž. Bajić, Damjan G. Vučurović, Siniša N. Dodić, Jovana A. Grahovac, Jelena M. Dodić

Abstract:

The biosynthesis of xanthan, a microbial polysaccharide produced by Xanthomonas campestris, is characterized by the possibility of using non-specific carbohydrate substrates, which means different waste effluents can be used as the basis for the production media. Potential raw material sources for xanthan production come from industries that generate large amounts of waste effluents rich in compounds necessary for microorganism growth and multiplication. Taking into account the amount of waste effluent generated by the bioethanol industry and the fact that it carries a high inorganic and organic load, it is clear that these effluents represent potential environmental pollutants if not properly treated. For this reason, it is necessary to develop new technologies that use the wastes and wastewaters of one industry as raw materials for another industry. The result is not only a new product but also reduced pollution and environmental protection. The biotechnological production of xanthan, which consists of using biocatalysts to convert bioethanol waste effluents into a high-value product, presents a possibility for sustainable development. This research uses scientific software developed for the modeling of biotechnological processes in order to design a xanthan production plant with bioethanol production waste effluents as the raw material. The model was developed in SuperPro Designer® using input data such as the composition of raw materials and products, the definition of unit operations, and utility consumption, while obtaining capital and operating costs and product revenues, to create a baseline production plant model. Results from this baseline model can help in the development of novel biopolymer production technologies. Additionally, a detailed economic analysis showed that this process for converting waste effluents into a high-value product is economically viable. Therefore, the proposed model represents a useful tool for scaling the process up from the laboratory or pilot plant to a working industrial-scale plant.

Keywords: biotechnology, process model, xanthan, waste effluents

Procedia PDF Downloads 337
28339 How Envisioning Process Is Constructed: An Exploratory Research Comparing Three International Public Televisions

Authors: Alexandre Bedard, Johane Brunet, Wendellyn Reid

Abstract:

Public television constantly tries to maintain and develop its audience, and to achieve those goals it needs a strong and clear vision. Vision, or envisioning, is a multidimensional process; from a business perspective, it is simultaneously a conduit that orients and fixes the future, an idea that comes before the strategy, and a means by which action is accomplished. Vision is also often studied in a prescriptive and instrumental manner. Based on our understanding of the literature, we explain how envisioning, as a process, is a creative one: it takes place in the mind and uses wisdom and intelligence through a process of evaluation, analysis and creation. Through an aggregation of the literature, we build a model of the envisioning process based on past experiences, perceptions and knowledge, and influenced by the context, namely the individual, the organization and the environment. Through exploratory research in which vision was deciphered through discourse, using a qualitative and abductive approach and a grounded theory perspective, we explored three extreme cases, with eighteen interviews with experts, leaders, politicians and industry actors, and more than twenty hours of interviews in three different countries. We compared the strategy, the business model, and the political and legal forces. We also looked at the history of each industry from an inertial point of view. Our analysis of the data revealed that a legitimacy effect, due to the audience, and the innovation and creativity of the institutions were the cornerstone of what influences the envisioning process. This allowed us to identify how different the process is for the Canadian, French and UK public broadcasters, although we concluded that all three have a socially constructed vision of their future, based on stakeholder management and an emerging role for managers as idea brokers.

Keywords: envisioning process, international comparison, television, vision

Procedia PDF Downloads 119
28338 Efficiency of Membrane Distillation to Produce Fresh Water

Authors: Sabri Mrayed, David Maccioni, Greg Leslie

Abstract:

Seawater desalination has been accepted as one of the most effective solutions to the growing problem of a diminishing clean drinking water supply. Currently, two desalination technologies dominate the market: thermally driven multi-stage flash distillation (MSF) and membrane-based reverse osmosis (RO). In recent years, however, membrane distillation (MD) has emerged as a potential alternative to the established means of desalination. This research project set out to determine the viability of MD as an alternative to MSF and RO for seawater desalination. Specifically, the project involves a thermodynamic analysis of the process, based on the second law of thermodynamics, to determine the efficiency of MD. Data were obtained from experiments carried out on a laboratory rig. In order to determine the exergy values required for the analysis, two separate models were built in Engineering Equation Solver: the 'Minimum Separation Work Model' and the 'Stream Exergy Model'. The efficiency of the MD process was found to be 17.3%, and the energy consumption was determined to be 4.5 kWh per cubic meter of fresh water produced. The results indicate that MD has potential as a technique for seawater desalination compared to RO and MSF. However, this was shown to be the case only if an alternative energy source, such as green or waste energy, is available to provide the thermal energy input to the process. If the process were required to power itself, it would be highly inefficient and in no way thermodynamically viable as a commercial desalination process.
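
For context, the reported second-law efficiency is consistent with relating a minimum (reversible) work of separation to the measured specific energy consumption; the 0.78 kWh/m³ figure below is a commonly cited minimum separation work for standard seawater and is an assumption of this sketch, not a value from the experimental rig.

```latex
\eta_{II} \;=\; \frac{W_{\min}}{W_{\text{actual}}}
          \;\approx\; \frac{0.78\ \mathrm{kWh/m^3}}{4.5\ \mathrm{kWh/m^3}}
          \;\approx\; 0.17
```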

Keywords: desalination, exergy, membrane distillation, second law efficiency

Procedia PDF Downloads 351