Search results for: gaussian process
14896 Uncertainty Quantification of Fuel Compositions on Premixed Bio-Syngas Combustion at High-Pressure
Abstract:
The effect of fuel variability on the premixed combustion of bio-syngas mixtures is of great importance in bio-syngas utilisation. Uncertainties in the concentrations of fuel constituents such as H2, CO and CH4 may lead to unpredictable combustion performance, combustion instabilities and hot spots which may degrade and damage the combustion hardware. Numerical modelling and simulations can assist in understanding the behaviour of bio-syngas combustion with pre-defined species concentrations, but evaluating the effect of concentration variability is expensive. To be more specific, questions such as 'what is the burning velocity of bio-syngas at a specific equivalence ratio?' have been answered either experimentally or numerically, while questions such as 'what is the likely range of burning velocities when the precise concentrations of the bio-syngas compositions are unknown but the concentration ranges are pre-described?' have not yet been answered. Uncertainty quantification (UQ) methods can be used to tackle such questions and assess the effects of fuel compositions. An efficient probabilistic UQ method based on Polynomial Chaos Expansion (PCE) techniques is employed in this study. The method relies on representing the random variables (combustion performance metrics) with orthogonal polynomials such as Legendre or Gaussian polynomials. The PCE constructed via Galerkin projection provides easy access to global sensitivities such as the main, joint and total Sobol indices. In this study, the impacts of fuel compositions on the combustion (adiabatic flame temperature and laminar flame speed) of bio-syngas fuel mixtures are presented using this PCE technique at several equivalence ratios. High-pressure effects on bio-syngas combustion instability are obtained using a detailed chemical mechanism, the San Diego Mechanism. Guidance on reducing combustion instability arising from the upstream biomass gasification process is provided by quantifying the significant contributions of composition variations to the variance of the physicochemical properties of bio-syngas combustion. It was found that flame speed is very sensitive to hydrogen variability in bio-syngas, and reducing hydrogen uncertainty from upstream biomass gasification processes can greatly reduce bio-syngas combustion instability. Variation of the methane concentration, although thought to be important, has limited impact on laminar flame instability, especially for lean combustion. Further studies on the UQ of the percentage concentration of hydrogen in bio-syngas can be conducted to guide the safer use of bio-syngas.
Keywords: bio-syngas combustion, clean energy utilisation, fuel variability, PCE, targeted uncertainty reduction, uncertainty quantification
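As a hedged illustration of the PCE-plus-Sobol workflow described above, the Python sketch below fits a total-degree-2 Legendre expansion to a purely hypothetical algebraic flame-speed surrogate of two uncertain inputs (H2 and CH4 fractions rescaled to [-1, 1]) and reads the main Sobol indices off the coefficients. It uses least-squares point collocation rather than the Galerkin projection used in the paper, and every function name and number is an assumption for illustration only.

```python
import numpy as np
from numpy.polynomial import legendre as L

# Hypothetical toy surrogate of laminar flame speed as a function of two
# uncertain mole fractions (H2, CH4), both rescaled to [-1, 1]; the model
# in the paper is a detailed chemical-kinetics calculation, not this formula.
def flame_speed(x_h2, x_ch4):
    return 0.4 + 0.25 * x_h2 + 0.05 * x_ch4 + 0.08 * x_h2**2 + 0.02 * x_h2 * x_ch4

rng = np.random.default_rng(0)
n_samples, degree = 500, 2
X = rng.uniform(-1.0, 1.0, size=(n_samples, 2))
y = flame_speed(X[:, 0], X[:, 1])

# Multi-index set for total degree <= 2 in two variables.
multi_indices = [(i, j) for i in range(degree + 1)
                 for j in range(degree + 1) if i + j <= degree]

def legendre_1d(n, x):
    # Legendre polynomial of order n, normalised to unit variance under U(-1, 1).
    c = np.zeros(n + 1); c[n] = 1.0
    return L.legval(x, c) * np.sqrt(2 * n + 1)

# Least-squares (point-collocation) estimate of the PCE coefficients.
Psi = np.column_stack([legendre_1d(i, X[:, 0]) * legendre_1d(j, X[:, 1])
                       for i, j in multi_indices])
coeffs, *_ = np.linalg.lstsq(Psi, y, rcond=None)

# For an orthonormal basis, variance = sum of squared non-constant coefficients,
# and the main Sobol index of a variable sums the terms that depend on it alone.
var_total = sum(c**2 for c, (i, j) in zip(coeffs, multi_indices) if (i, j) != (0, 0))
s_h2 = sum(c**2 for c, (i, j) in zip(coeffs, multi_indices) if i > 0 and j == 0) / var_total
s_ch4 = sum(c**2 for c, (i, j) in zip(coeffs, multi_indices) if j > 0 and i == 0) / var_total
print(f"main Sobol indices: S_H2 = {s_h2:.3f}, S_CH4 = {s_ch4:.3f}")
```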
Procedia PDF Downloads 275
14895 Lean Manufacturing Implementation in Fused Plastic Bags Industry
Authors: Tareq Issa
Abstract:
Lean manufacturing is concerned with the implementation of several tools and methodologies that aim at the continuous elimination of waste throughout the manufacturing process flow in the production system. This research addresses the implementation of lean principles and tools in a small-to-medium industry, focusing on a 'fused' plastic bags production company in Amman, Jordan. In this production operation, the major types of waste to eliminate include material, waiting-transportation, and setup wastes. The primary goal is to identify and implement selected lean strategies to eliminate waste in the manufacturing process flow. A systematic approach was used for the implementation of lean principles and techniques, through the application of Value Stream Mapping analysis. The current-state value stream map was constructed to improve the plastic bags manufacturing process by identifying opportunities to eliminate waste and its sources. Also, the future-state value stream map was developed, describing the improvements in the overall manufacturing process that result from eliminating these wastes. The implementation of VSM, 5S, Kanban, Kaizen, and reduced lot size methods has provided significant benefits and results. Productivity increased to 95.4%, the delivery schedule was attained at 99-100%, total inventory was reduced to 1.4 days, and the setup time for the melting process was reduced to about 30 minutes.
Keywords: lean implementation, plastic bags industry, value stream map, process flow
Procedia PDF Downloads 175
14894 The Use of Smart Power Concepts in the Military Targeting Process
Authors: Serdal AKYUZ
Abstract:
Smart power is the use of soft and hard power together, in consideration of existing circumstances. Soft power can be defined as the capability of changing the perception of any target mass by employing policies based on legality. Hard power, generally, uses military and economic instruments, which are the concrete indicators of the general comprehension of power. More than providing a balance between soft and hard power, smart power creates a proactive combination by assessing existing resources. The military targeting process (MTP), as stated in the smart power methodology, benefits from a wide scope of lethal and non-lethal weapons to reach the intended end state. The components of smart power can be used in the military targeting process in a manner similar to the use of lethal or non-lethal weapons. This paper investigates the current use of the smart power concept and the MTP, and presents a new approach to the MTP from a smart power point of view.
Keywords: future security environment, hard power, military targeting process, soft power, smart power
Procedia PDF Downloads 475
14893 Enhancement of MIMO H₂S Gas Sweetening Separator Tower Using Fuzzy Logic Controller Array
Authors: Muhammad M. A. S. Mahmoud
Abstract:
The natural gas sweetening process is a controlled process that must be carried out at maximum efficiency and with the highest quality. In this work, due to the complexity and non-linearity of the process, the H₂S gas separation and the intelligent fuzzy controller used to enhance the process are simulated in MATLAB/Simulink. The new fuzzy control design for the gas separator is discussed in this paper. The design is based on the utilization of linear state estimation to generate the internal knowledge base that stores input-output pairs. The obtained input/output pairs are then used to design a feedback fuzzy controller. The proposed closed-loop fuzzy control system maintains the asymptotic stability of the system while enhancing its time response, to achieve better control of the concentration of the output gas from the tower. Simulation studies are carried out to illustrate the gas separator system performance.
Keywords: gas separator, gas sweetening, intelligent controller, fuzzy control
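To make the feedback fuzzy controller idea concrete, here is a minimal single-input, single-output Mamdani sketch in Python: a normalised H₂S concentration error drives a solvent-valve adjustment through three triangular rules and centroid defuzzification. The membership functions, rule base and variable names are assumptions for illustration; the paper's controller is designed from the state-estimation knowledge base and runs in MATLAB/Simulink.

```python
import numpy as np

def tri(x, a, b, c):
    # Triangular membership function with feet at a, c and peak at b.
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

def fuzzy_valve_adjustment(error):
    """Map the H2S concentration error (hypothetical, normalised to [-1, 1])
    to a solvent-valve adjustment in [-1, 1] with a 3-rule Mamdani controller."""
    u = np.linspace(-1.0, 1.0, 201)                 # output universe of discourse
    # Rule firing strengths from the input membership functions.
    neg = tri(error, -1.5, -1.0, 0.0)
    zer = tri(error, -1.0, 0.0, 1.0)
    pos = tri(error, 0.0, 1.0, 1.5)
    # Clip (min-implication) each rule's output set, then aggregate with max.
    agg = np.maximum.reduce([
        np.minimum(neg, tri(u, -1.5, -1.0, 0.0)),   # error negative -> close valve
        np.minimum(zer, tri(u, -0.5, 0.0, 0.5)),    # error near zero -> hold
        np.minimum(pos, tri(u, 0.0, 1.0, 1.5)),     # error positive -> open valve
    ])
    return np.trapz(agg * u, u) / (np.trapz(agg, u) + 1e-12)   # centroid defuzzification

print(fuzzy_valve_adjustment(0.4))   # e.g. outlet H2S above setpoint -> open solvent valve
```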
Procedia PDF Downloads 471
14892 A Tool for Assessing Performance and Structural Quality of Business Process
Authors: Mariem Kchaou, Wiem Khlif, Faiez Gargouri
Abstract:
Modeling business processes is an essential task when evaluating, improving, or documenting existing business processes. To be efficient in such tasks, a business process model (BPM) must have high structural quality and high performance. Evaluating the performance of a business process model is a necessary step to reduce time and cost, while assessing the structural quality aims to improve the understandability and the modifiability of the BPMN model. To achieve these objectives, a set of structural and performance measures has been proposed. Given the diversity of these measures, we propose a framework that integrates both structural and performance aspects for classifying them. Our measure classification is based on business process model perspectives (e.g., informational, functional, organizational, behavioral, and temporal) and the elements (activity, event, actor, etc.) involved in computing the measures. Then, we implement this framework in a tool for assessing the structural quality and the performance of a business process. The tool helps designers to select an appropriate subset of measures associated with the corresponding perspective and to calculate and interpret their values in order to improve the structural quality and the performance of the model.
Keywords: performance, structural quality, perspectives, tool, classification framework, measures
Procedia PDF Downloads 156
14891 The Use of Artificial Intelligence for Harmonization in the Lawmaking Process
Authors: Supriyadi, Andi Intan Purnamasari, Aminuddin Kasim, Sulbadana, Mohammad Reza
Abstract:
The development of the Industrial Revolution 4.0 era has brought a significant influence on the administration of countries in all parts of the world, including Indonesia, not only in the administrative and economic sectors; the ways and methods of forming laws should also be adjusted. Until now, the process of making laws carried out by the Parliament with the Government still uses the classical method. The law-making process still uses manual methods, such as typing the harmonization of regulations, so it is not uncommon for errors to occur, such as writing errors, miscopied articles and so on; these are tasks that require a high level of accuracy yet rely on inventory and harmonization carried out manually by humans. This method often creates problems due to errors and inaccuracies on the part of the officers who harmonize laws after discussion and approval, and this has a very serious impact on the system of law formation in Indonesia. The use of artificial intelligence in the process of forming laws appears justified and becomes the answer for minimizing the disharmony of various laws and regulations. This research is normative research using the legislative approach and the conceptual approach. It focuses on the question of how to use artificial intelligence for harmonization in the lawmaking process.
Keywords: artificial intelligence, harmonization, laws, intelligence
Procedia PDF Downloads 161
14890 Inadequate Requirements Engineering Process: A Key Factor for Poor Software Development in Developing Nations: A Case Study
Authors: K. Adu Michael, K. Alese Boniface
Abstract:
Developing reliable and sustainable software products is today a big challenge among up-and-coming software developers in Nigeria. The ability to develop the comprehensive problem statement needed to execute a proper requirements engineering process is missing. The need to describe the 'what' of a system in one document, written in a natural language, is a major step in the overall process of software engineering. Requirements engineering is a process used to discover, analyze and validate system requirements. This process is needed to reduce software errors at the early stages of software development. The importance of each of the steps in requirements engineering is clearly explained in the context of using a detailed problem statement from the client/customer to get an overview of an existing system along with the expectations for the new system. This paper identifies inadequate application of requirements engineering principles as the major cause of poor software development in developing nations, using a case study of final-year computer science students of a tertiary-education institution in Nigeria.
Keywords: client/customer, problem statement, requirements engineering, software developers
Procedia PDF Downloads 405
14889 Modelling and Optimization of Laser Cutting Operations
Authors: Hany Mohamed Abdu, Mohamed Hassan Gadallah, El-Giushi Mokhtar, Yehia Mahmoud Ismail
Abstract:
Laser beam cutting is a nontraditional machining process. This paper optimizes the laser beam cutting parameters for stainless steel (316L) by considering the effect of the input parameters, viz. power, oxygen pressure, frequency and cutting speed. A statistical design of experiments is carried out at three different levels, and process responses such as average kerf taper (Ta) and surface roughness (Ra) are measured accordingly. A quadratic mathematical model (RSM) for each of the responses is developed as a function of the process parameters. Responses predicted by the models (as per Taguchi's L27 orthogonal array) are employed to search for an optimal parametric combination to achieve the desired yield of the process. RSM models are developed for the mean responses, the S/N ratio, and the standard deviation of the responses. Optimization models are formulated as single-objective problems subject to process constraints. The models are formulated based on analysis of variance (ANOVA) using the MATLAB environment. The optimum solutions are compared with the Taguchi methodology results.
Keywords: optimization, laser cutting, robust design, kerf width, Taguchi method, RSM and DOE
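As a rough sketch of the quadratic RSM fitting step described above, the Python snippet below fits a full second-order model to a hypothetical coded design over two of the four factors and grid-searches the fitted surface for the coded setting that minimises predicted roughness. The design points, responses and optimisation step are illustrative assumptions, not the paper's L27 data or its constrained MATLAB formulation.

```python
import numpy as np

# Hypothetical coded settings (-1, 0, +1) for two of the four factors
# (cutting speed, power) and hypothetical surface roughness responses;
# the paper fits full quadratic RSM models for Ta and Ra over four factors.
X = np.array([[-1, -1], [-1, 0], [-1, 1],
              [ 0, -1], [ 0, 0], [ 0, 1],
              [ 1, -1], [ 1, 0], [ 1, 1]], dtype=float)
ra = np.array([2.9, 2.4, 2.6, 2.5, 2.0, 2.2, 2.8, 2.3, 2.7])

def quadratic_design_matrix(x):
    s, p = x[:, 0], x[:, 1]
    return np.column_stack([np.ones(len(x)), s, p, s**2, p**2, s * p])

beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), ra, rcond=None)
print("RSM coefficients b0, b1, b2, b11, b22, b12:", np.round(beta, 3))

# Grid search for the coded setting minimising predicted Ra, standing in for
# the constrained single-objective optimisation solved in MATLAB.
grid = np.array([[s, p] for s in np.linspace(-1, 1, 41) for p in np.linspace(-1, 1, 41)])
pred = quadratic_design_matrix(grid) @ beta
print("predicted optimum (coded):", grid[np.argmin(pred)], "Ra =", round(pred.min(), 3))
```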
Procedia PDF Downloads 619
14888 Probing Multiple Relaxation Process in Zr-Cu Base Alloy Using Mechanical Spectroscopy
Authors: A. P. Srivastava, D. Srivastava, D. J. Browne
Abstract:
The relaxation dynamics of a Zr44Cu40Al8Ag8 bulk metallic glass (BMG) have been probed using a dynamic mechanical analyzer. The BMG sample was cast in the form of a plate of dimensions 55 mm x 40 mm x 3 mm using the tilt casting technique. X-ray diffraction and transmission electron microscopy were used for the microstructural characterization of the as-cast BMG. For the mechanical spectroscopy study, samples in the form of bars of size 55 mm x 2 mm x 3 mm were machined from the BMG plate. The mechanical spectroscopy was performed on a dynamic mechanical analyzer (DMA) by the 50 mm three-point bending method in a nitrogen atmosphere. It was observed that two glass transition processes were competing in the supercooled liquid region, around temperatures of 390°C and 430°C. The supercooled liquid state was completely characterized using the DMA and a differential scanning calorimeter (DSC). In addition to the main α-relaxation process, the presence of a β-relaxation process around 360°C, below the glass transition temperature, was also observed. The β-relaxation process could be described by an Arrhenius law with an activation energy of 160 kJ/mol. The volume of the flow unit associated with this relaxation process has been estimated. The results from the DMA study have been used to characterize the shear transformation zone in terms of activation volume and size. The high fragility parameter value of 34 and the higher activation volume indicate that this alloy could show good plasticity in the supercooled liquid region. The possible mechanisms for the relaxation processes have been discussed.
Keywords: DMA, glass transition, metallic glass, thermoplastic forming
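The Arrhenius description of the β relaxation reduces to a straight-line fit of ln(frequency) against the reciprocal peak temperature, as sketched below. The frequency-temperature pairs are hypothetical values chosen only to be roughly consistent with a peak near 360°C and an activation energy near the reported 160 kJ/mol; they are not the paper's measurements.

```python
import numpy as np

# Hypothetical DMA data: driving frequencies (Hz) and the temperatures (K)
# at which the beta-relaxation loss peak appears at each frequency.
freq = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
t_peak = np.array([589.0, 601.0, 615.0, 633.0, 648.0])

R = 8.314  # J / (mol K)
# Arrhenius law: ln f = ln f0 - Ea / (R * T_peak), so a linear fit of
# ln f against 1/T_peak gives slope = -Ea / R.
slope, _ = np.polyfit(1.0 / t_peak, np.log(freq), 1)
activation_energy = -slope * R / 1000.0   # kJ/mol
print(f"estimated activation energy: {activation_energy:.0f} kJ/mol")
```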
Procedia PDF Downloads 295
14887 Tool Condition Monitoring of Ceramic Inserted Tools in High Speed Machining through Image Processing
Authors: Javier A. Dominguez Caballero, Graeme A. Manson, Matthew B. Marshall
Abstract:
Cutting tools with ceramic inserts are often used in the process of machining many types of superalloy, mainly due to their high strength and thermal resistance. Nevertheless, during the cutting process, the plastic flow wear generated in these inserts enhances and propagates cracks due to high temperature and high mechanical stress. This leads to highly variable failure of the cutting tool. This article explores the relationship between the continuous wear that ceramic SiAlON (solid solutions based on the Si3N4 structure) inserts experience during a high-speed machining process and the evolution of the sparks created during the same process. These sparks were analysed through pictures of the cutting process recorded using an SLR camera. Features relating to the intensity and area of the cutting sparks were extracted from the individual pictures using image processing techniques. These features were then related to the ceramic insert's crater wear area.
Keywords: ceramic cutting tools, high speed machining, image processing, tool condition monitoring, tool wear
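A minimal Python sketch of the kind of spark feature extraction described above: threshold a grayscale frame, then take the spark pixel count as an area feature and the summed brightness as an intensity feature. The threshold value, the synthetic frame and both feature definitions are assumptions for illustration, not the paper's actual processing pipeline.

```python
import numpy as np

def spark_features(gray_frame, threshold=200):
    """Extract simple spark descriptors from one grayscale frame (0-255).
    Both feature definitions are illustrative stand-ins for the ones in the paper."""
    mask = gray_frame >= threshold            # bright pixels assumed to be sparks
    spark_area = int(mask.sum())              # number of spark pixels
    spark_intensity = float(gray_frame[mask].sum()) if spark_area else 0.0
    return spark_area, spark_intensity

# Synthetic 120x160 frame with a dark background and one bright spark streak.
rng = np.random.default_rng(1)
frame = rng.integers(0, 60, size=(120, 160)).astype(float)
frame[40:45, 30:90] = 240.0
print(spark_features(frame))   # features to be regressed against crater wear area
```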
Procedia PDF Downloads 298
14886 Rounded-off Measurements and Their Implication on Control Charts
Authors: Ran Etgar
Abstract:
The process of rounding off measurements of continuous variables is commonly encountered. Although it usually has minor effects, it can sometimes lead to poor outcomes in statistical process control using the X̄-chart. The traditional control limits can cause incorrect conclusions if applied carelessly. This study looks into the limitations of the classical control limits, particularly the impact of asymmetry. An approach to determining the distribution function of the measured parameter (Ȳ) is presented, resulting in a more precise method to establish the upper and lower control limits. The proposed method, while slightly more complex than Shewhart's original idea, is still user-friendly and accurate and only requires the use of two straightforward tables.
Keywords: inaccurate measurement, SPC, statistical process control, rounded-off, control chart
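The sketch below shows the classical Shewhart X̄-chart limit calculation applied to the same simulated data before and after rounding to a coarse measurement resolution, which is the situation the study analyses. The process parameters, subgroup size and rounding resolution are illustrative assumptions, and the corrected asymmetric limits proposed in the paper are not reproduced here.

```python
import numpy as np

# Simulate a process, round the measurements to a coarse resolution, and build
# classical X-bar chart limits from the rounded data; the numbers are illustrative,
# not taken from the paper.
rng = np.random.default_rng(2)
mu, sigma, n, subgroups, resolution = 10.0, 0.05, 5, 200, 0.1

raw = rng.normal(mu, sigma, size=(subgroups, n))
rounded = np.round(raw / resolution) * resolution   # rounding to the nearest 0.1

for name, data in [("exact", raw), ("rounded", rounded)]:
    xbar = data.mean(axis=1)
    center = xbar.mean()
    sigma_hat = data.std(axis=1, ddof=1).mean() / 0.9400   # c4 constant for n = 5
    ucl = center + 3 * sigma_hat / np.sqrt(n)
    lcl = center - 3 * sigma_hat / np.sqrt(n)
    print(f"{name:8s} LCL = {lcl:.4f}, UCL = {ucl:.4f}")
```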
Procedia PDF Downloads 40
14885 A Distributed Cryptographically Generated Address Computing Algorithm for Secure Neighbor Discovery Protocol in IPv6
Authors: M. Moslehpour, S. Khorsandi
Abstract:
Due to the shortage of IPv4 addresses, the transition to IPv6 has gained significant momentum in recent years. Like the Address Resolution Protocol (ARP) in IPv4, the Neighbor Discovery Protocol (NDP) provides functions such as address resolution in IPv6. Besides its functionality, NDP is vulnerable to several attacks. To mitigate these attacks, Internet Protocol Security (IPsec) was introduced, but it was not efficient due to its limitations. Therefore, the SEND protocol was proposed for automatic protection of the auto-configuration process; it secures the neighbor discovery and address resolution processes. To defend against threats to NDP's integrity and identity, Cryptographically Generated Addresses (CGA) and asymmetric cryptography are used by SEND. Besides the advantages of SEND, its disadvantages, such as the computational cost of the CGA algorithm and the sequential nature of the CGA generation algorithm, are considerable. In this paper, we parallelize this process across network resources in order to improve it. In addition, we compare the CGA generation time for the self-computing and distributed-computing processes. We focus on the impact of malicious nodes on the CGA generation time in the network. According to the results, although malicious nodes participate in the generation process, the CGA generation time is less than when the address is computed in a self-computing (single-node) manner. With a trust management system, detecting and insulating malicious nodes is easier.
Keywords: NDP, IPsec, SEND, CGA, modifier, malicious node, self-computing, distributed-computing
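The expensive, inherently serial step being distributed here is the CGA modifier search. The Python sketch below is a heavily simplified, RFC 3972-flavoured illustration of that brute-force loop: it draws random modifiers until the SHA-1 digest over (modifier || nine zero bytes || public key) begins with 16*Sec zero bits. The key bytes, field layout and stopping rule are condensed assumptions, not a conformant CGA implementation.

```python
import hashlib
import os

def find_modifier(public_key: bytes, sec: int) -> bytes:
    """Very simplified sketch of the CGA Hash2 brute-force step:
    search for a 16-byte modifier whose SHA-1 digest over
    (modifier || 9 zero bytes || public key) starts with 16*sec zero bits."""
    zero_bits = 16 * sec
    while True:
        modifier = os.urandom(16)
        digest = hashlib.sha1(modifier + b"\x00" * 9 + public_key).digest()
        value = int.from_bytes(digest[:8], "big")
        if value >> (64 - zero_bits) == 0:        # leading 16*sec bits are zero
            return modifier

# Toy "public key" bytes; with sec=1 the loop finishes quickly, while each
# increment of sec multiplies the expected work by 2**16, which is the cost
# the distributed scheme spreads across network resources.
modifier = find_modifier(b"example-public-key", sec=1)
print(modifier.hex())
```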
Procedia PDF Downloads 278
14884 Human Action Recognition Using Wavelets of Derived Beta Distributions
Authors: Neziha Jaouedi, Noureddine Boujnah, Mohamed Salim Bouhlel
Abstract:
In the framework of enhancing human-machine interaction systems, we focus in this paper on human behavior analysis and action recognition. Human behavior is characterized by a duality of actions and reactions (movements, psychological changes, verbal and emotional expression). It is worth noting that much information is hidden behind gestures, sudden motions, point trajectories and speeds, and many research works have treated this as an information retrieval issue. In our work, we focus on motion extraction, tracking and action recognition using wavelet network approaches. Our contribution uses human subtraction by a Gaussian Mixture Model (GMM) and an analysis of body movement through trajectory models of motion constructed from a Kalman filter. These models allow the noise to be removed through the extraction of the main motion features and constitute a stable base for identifying the evolution of human activity. Each modality is used to recognize a human action using the wavelets of derived beta distributions approach. The proposed approach has been validated successfully on subsets of the KTH and UCF Sports databases.
Keywords: feature extraction, human action classifier, wavelet neural network, beta wavelet
Procedia PDF Downloads 411
14883 Systemic Functional Grammar Analysis of Barack Obama's Second Term Inaugural Speech
Authors: Sadiq Aminu, Ahmed Lamido
Abstract:
This research studies Barack Obama's second inaugural speech using Halliday's Systemic Functional Grammar (SFG). SFG is a text grammar which describes how language is used, so that the meaning of the text can be better understood. The primary source of data in this research work is Barack Obama's second inaugural speech, which was obtained from the internet. The analysis of the speech was based on the ideational and textual metafunctions of Systemic Functional Grammar. Specifically, the researcher analyses the process types and participants (ideational) and the theme/rheme (textual). It was found that the material process (process of doing) was the most frequently used process type, and 'We', which refers to the people of America, was the most frequently used theme. The application of SFG theory, therefore, gives a better understanding of the meaning of Barack Obama's speech.
Keywords: ideational, metafunction, rheme, textual, theme
Procedia PDF Downloads 159
14882 How to Enhance the Performance of Universities by Implementing the Balanced Scorecard Using FDM and ANP
Authors: Neda Jalaliyoon, Nooh Abu Bakar, Hamed Taherdoost
Abstract:
The present research recommends a balanced scorecard (BSC) framework to appraise the performance of universities. As the original balanced scorecard model has four perspectives, the same model, with the 'financial', 'customer', 'internal process' and 'learning and growth' perspectives, is used to implement the BSC in the present research. By applying the fuzzy Delphi method (FDM) and a questionnaire, sixteen measures of performance were identified. Moreover, using the analytic network process (ANP), the weights of the selected indicators were determined. The results indicated that the most important BSC aspects were Internal Process (0.3149), Customer (0.2769), Learning and Growth (0.2049), and Financial (0.2033), respectively. The proposed BSC framework can help universities to enhance their efficiency in a competitive environment.
Keywords: balanced scorecard, higher education, fuzzy delphi method, analytic network process (ANP)
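The weighting step at the heart of ANP (shared with AHP) is the principal eigenvector of a pairwise comparison matrix. The sketch below computes such weights and a consistency ratio for the four BSC perspectives from a purely hypothetical comparison matrix; the judgements are illustrative, and a full ANP would further combine such blocks in a supermatrix, which is not shown here.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for the four BSC perspectives
# (Financial, Customer, Internal Process, Learning & Growth); the entries are
# illustrative and not the judgements collected in the study.
A = np.array([[1.0, 1/2, 1/2, 1.0],
              [2.0, 1.0, 1/2, 2.0],
              [2.0, 2.0, 1.0, 2.0],
              [1.0, 1/2, 1/2, 1.0]])

# Priorities are the principal eigenvector of A, normalised to sum to one.
eigvals, eigvecs = np.linalg.eig(A)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = principal / principal.sum()
for name, w in zip(["Financial", "Customer", "Internal Process", "Learning & Growth"], weights):
    print(f"{name:18s} {w:.4f}")

# Consistency ratio check (random index 0.90 for a 4x4 matrix).
lambda_max = np.real(eigvals).max()
ci = (lambda_max - len(A)) / (len(A) - 1)
print("consistency ratio:", round(ci / 0.90, 3))
```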
Procedia PDF Downloads 426
14881 Design and Implementation of LabVIEW Based Relay Autotuning Controller for Level Setup
Authors: Manoj M. Sarode, Sharad P. Jadhav, Mukesh D. Patil, Pushparaj S. Suryawanshi
Abstract:
Even though the PID controller is widely used in industrial processes, tuning of the PID parameters is not easy; it is time consuming and requires expert people. Another drawback of the PID controller is that the process dynamics might change over time. This can happen due to variation of the process load, normal wear and tear, etc. To compensate for process behavior changes over time, expert users are required to recalibrate the PID gains. The implementation of model-based controllers usually needs a process model. Identification of a process model is a time-consuming job, and there is no guarantee of model accuracy. If the identified model is not accurate, the performance of the controller may degrade. Model-based controllers are also quite expensive, and the whole implementation procedure is sometimes tedious. To eliminate such issues, the autotuning PID controller becomes a vital element. A software-based relay feedback autotuning controller proves to be an efficient, upgradable and maintenance-free controller. With a relay feedback autotuning controller, the PID parameters can be obtained within a very short span of time. This paper presents the real-time implementation of a LabVIEW-based relay feedback autotuning PID controller. It is successfully developed and implemented to control the level of a laboratory setup. Its performance is analyzed for different setpoints and found satisfactory.
Keywords: autotuning, PID, liquid level control, recalibrate, labview, controller
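A common way to turn a relay feedback test into PID gains is the Astrom-Hagglund estimate of the ultimate gain followed by Ziegler-Nichols rules, sketched below in Python. The relay amplitude, oscillation amplitude and period are hypothetical numbers, and the paper's LabVIEW implementation may use different tuning rules.

```python
import math

def relay_pid_gains(d, a, pu):
    """Relay feedback tuning sketch (Astrom-Hagglund estimate + Ziegler-Nichols rules).
    d  : relay amplitude applied to the control valve
    a  : measured amplitude of the resulting level oscillation
    pu : measured oscillation period (s)
    All numbers below are hypothetical, not from the LabVIEW rig in the paper."""
    ku = 4.0 * d / (math.pi * a)          # ultimate gain from the relay experiment
    kp = 0.6 * ku                         # classic Ziegler-Nichols PID settings
    ti = pu / 2.0
    td = pu / 8.0
    return kp, ti, td

kp, ti, td = relay_pid_gains(d=5.0, a=2.1, pu=38.0)
print(f"Kp = {kp:.2f}, Ti = {ti:.1f} s, Td = {td:.2f} s")
```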
Procedia PDF Downloads 394
14880 Project Management Agile Model Based on Project Management Body of Knowledge Guideline
Authors: Mehrzad Abdi Khalife, Iraj Mahdavi
Abstract:
This paper presents an agile model for the project management process. For the project management process, the Project Management Body of Knowledge (PMBOK) guideline has been selected as the platform. A combination of computational science and artificial intelligence methodology has been added to the guideline to transform the standard into an agile project management process. The model is the combination of a practical standard, computational science and artificial intelligence. In this model, we present a communication model and protocols to keep the process agile. Here, we illustrate the collaboration of man and machine in the project management area with an artificial intelligence approach.
Keywords: artificial intelligence, conceptual model, man-machine collaboration, project management, standard
Procedia PDF Downloads 341
14879 An Evaluation of the Impact of Media on the Electoral Reform Process in Nigeria between 2010 and 2015
Authors: H. Shola Adeosun, D. Adeoye Odedeji, F. Ajoke Adebiyi
Abstract:
This study examines the impact of the media on the electoral process in Nigeria and the roles played by the media in the reform process. The survey research method was adopted as the research methodology, enabling the researcher to use questionnaires and oral interviews to elicit primary data from the respondents; the data were analysed and interpreted with statistical tools such as tables, figures, and percentages. The hypotheses formulated were tested with chi-square. The findings revealed that there is a significant relationship between the media and the electoral reform process in the 2011 and 2015 general elections in Nigeria. The study recommends that the electoral committee should implement a virile electoral system with a peaceful voting environment. The media should intensify efforts to expose violations of electoral laws and should play an advocacy role for dialogue and debate on the reform recommendations. The study also recommends that the media should unite the nation through their reports on peace, national security, national integration and ethnoreligious tolerance, and that adequate training should be given to media practitioners on how to report issues relating to elections.
Keywords: evaluation, impact, media, electoral reform process
Procedia PDF Downloads 288
14878 Public Participation as a Social Inclusion Tool in the Urban Planning Process: A Case Study of Abuja, Nigeria
Authors: Nwachi Prosper Louis, Cynthia Ogonna Ikesee
Abstract:
The urban planning system of cities varies by country, but in general, it is an instrument for establishing long-term sustainable frameworks and plans for social, institutional and economic development. In most communities, there is limited knowledge, development, and implementation of effective and sustainable urban planning structures and plans that encourage social inclusion. This has led to social, economic and environmental deficiencies, resulting in community isolation and segregation by class, ethnicity, and race. Encouraging public participation in the urban planning process is one of the instruments that cities can utilise to achieve better social inclusion outcomes. This paper explores how public participation can be used as a social inclusion tool in the urban planning process to achieve better outcomes in the Abuja urban planning system. The purpose of this study is to investigate the effectiveness of this approach. Also, a conceptual model was developed which evaluates the relationship between public participation and social inclusion outcomes in the urban planning process. It was seen that every community has its peculiar way of life and challenges, and an understanding of these societal needs is paramount in the urban planning process. Therefore, the involvement of the public in identifying their needs, selecting priorities and identifying strategies offers better chances of developing solutions that are sustainable, feasible and implementable.
Keywords: public participation, social inclusion, urban planning, urban planning process
Procedia PDF Downloads 200
14877 Difficulties Encountered in the Process of Supporting Reading Skills of a Student with Hearing Loss Whose Inclusion Was Ongoing and Solution Proposals
Authors: Ezgi Tozak, H. Pelin Karasu, Umit Girgin
Abstract:
In this study, the difficulties encountered in the process of supporting the reading skills of a student with hearing loss whose inclusion was ongoing, and the solutions developed during the practice process, were examined. The study design was action research. The participants of this study, which was conducted between 29 September 2016 and 22 February 2017, consisted of a student with hearing loss, a classroom teacher, a teacher in the rehabilitation center, a researcher/teacher and validity committee members. The data were obtained through observations, validity committee meetings, interviews, documents, and the researcher's diary. The research findings show that in the process of supporting the reading skills of the student with hearing loss, the student's knowledge of concepts was limited, and the student had difficulties in perceiving and identifying sounds, reading and understanding words and sentences, and retelling what he/she listened to. With the purpose of overcoming these difficulties in the implementation process, activities were prepared on concepts, sound education, reading and understanding words and sentences, and retelling what is listened to; these activities were supported with visual materials and real objects and repeated with variations.
Keywords: inclusion, reading process, supportive education, student with hearing loss
Procedia PDF Downloads 147
14876 Online Monitoring Rheological Property of Polymer Melt during Injection Molding
Authors: Chung-Chih Lin, Chien-Liang Wu
Abstract:
The detection of the polymer melt state during the manufacturing process is regarded as an efficient way to control the molded part quality in advance. Online monitoring of the rheological properties of the polymer melt during the processing procedure provides an approach to understanding the melt state immediately. Rheological properties reflect the polymer melt state at different processing parameters and are especially important in the injection molding process. An approach that demonstrates how to calculate the rheological properties of a polymer melt through in-process measurement, using injection molding as an example, is proposed in this study. The system consists of two sensors and a data acquisition module that can process the measured data, which are used for the calculation of the rheological properties of the polymer melt. The rheological properties of the polymer melt discussed in this study include shear rate and viscosity, which are investigated with respect to injection speed and melt temperature. The results show that the effect of injection speed on the rheological properties is apparent, especially at high melt temperatures, and should be considered for precision molding processes.
Keywords: injection molding, melt viscosity, shear rate, monitoring
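For a sense of how two pressure sensors and a known flow rate can yield shear rate and viscosity, the sketch below applies the standard slit-flow formulas (apparent wall shear rate 6Q/(WH²), wall shear stress ΔP·H/(2L)) with hypothetical geometry and readings. The paper's actual sensor arrangement, channel geometry and any non-Newtonian corrections are not reproduced here.

```python
def melt_viscosity_slit(delta_p, flow_rate, width, height, length):
    """Apparent shear rate and viscosity from two in-cavity pressure sensors,
    assuming a slit flow channel (Newtonian wall formulas, no Rabinowitsch
    correction); the geometry and readings below are hypothetical.
    delta_p   : pressure drop between the two sensors (Pa)
    flow_rate : volumetric flow rate, injection speed x screw area (m^3/s)
    width, height, length : slit width, gap and sensor spacing (m)"""
    shear_rate = 6.0 * flow_rate / (width * height ** 2)        # 1/s
    shear_stress = delta_p * height / (2.0 * length)            # Pa
    return shear_rate, shear_stress / shear_rate                # (1/s, Pa.s)

rate, viscosity = melt_viscosity_slit(delta_p=8.0e6, flow_rate=2.0e-5,
                                      width=0.02, height=0.002, length=0.05)
print(f"shear rate = {rate:.0f} 1/s, viscosity = {viscosity:.1f} Pa.s")
```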
Procedia PDF Downloads 381
14875 CMMI Key Process Areas and FDD Practices
Authors: Rituraj Deka, Nomi Baruah
Abstract:
The development of information technology during the past few years has resulted in the design of more and more complex software. The outsourcing of software development places higher requirements on the management of software development projects. Software enterprises follow various paths in their pursuit of excellence, applying various principles, methods and techniques along the way. New research is proving that CMMI and Agile methodologies can benefit from using both methods within organizations, with the potential to dramatically improve business performance. The paper describes a mapping between CMMI key process areas (KPAs) and Feature-Driven Development (FDD) practices from a communication perspective, so as to increase the understanding of how improvements can be made in the software development process.
Keywords: Agile, CMMI, FDD, KPAs
Procedia PDF Downloads 458
14874 Interior Design Pedagogy in the 21st Century: Personalised Design Process
Authors: Roba Zakariah Shaheen
Abstract:
In the 21st century, interior design pedagogy has developed rapidly due to social and economic factors. Socially, this paper presents research findings that show a significant relationship between educators and students in interior design education. It shows that students' personal traits, design process, and thinking process are significantly interrelated. Constructively, this paper presents how personal traits can guide educators in the interior design education domain to develop students' thinking processes. At the same time, it demonstrates how students should use their own personal traits to create their own design processes. Constructivism was the theory underpinning this research, as it supports grounded theory, which is the methodological approach of this research. Moreover, the Myers-Briggs Type Indicator was used to investigate personality traits scientifically, as a psychological instrument related to cognitive ability. The conclusions from this research strongly recommend that educators and students should utilize their personal traits to foster interior design education.
Keywords: interior design, pedagogy, constructivism, grounded theory, personality traits, creativity
Procedia PDF Downloads 207
14873 Active Contours for Image Segmentation Based on Complex Domain Approach
Authors: Sajid Hussain
Abstract:
A complex domain approach for image segmentation based on active contours has been designed, in which the contour deforms step by step to partition an image into numerous expedient regions. A novel region-based trigonometric complex pressure force function is proposed, which propagates around the region of interest using image forces. The signed trigonometric force function controls the propagation of the active contour, and the active contour stops accurately on the exact edges of the object. The proposed model makes the level set function binary and uses a Gaussian smoothing kernel to adjust it and avoid the re-initialization procedure. The working principle of the proposed model is as follows: the real image data are transformed into complex data by adding iota (i) times the image data, and the average of iota (i) times the horizontal and vertical components of the gradient of the image data is inserted into the proposed model to capture the complex gradient of the image data. A simple finite difference mathematical technique has been used to implement the proposed model. The efficiency and robustness of the proposed model have been verified and compared with other state-of-the-art models.
Keywords: image segmentation, active contour, level set, Mumford and Shah model
Procedia PDF Downloads 114
14872 Transfer Knowledge From Multiple Source Problems to a Target Problem in Genetic Algorithm
Authors: Terence Soule, Tami Al Ghamdi
Abstract:
To study how to transfer knowledge from multiple source problems to a target problem, we modeled the transfer learning (TL) process using genetic algorithms as the model solver. TL is the process that aims to transfer learned data from one problem to another. The TL process aims to help machine learning (ML) algorithms find a solution to the problem. Genetic algorithms (GA) give researchers access to the information we have about how the old problem was solved. In this paper, we have five different source problems, and we transfer the knowledge to the target problem. We studied different scenarios of the target problem. The results showed that combining knowledge from multiple source problems improves GA performance. Also, the process of combining knowledge from several problems promotes diversity in the transferred population.
Keywords: transfer learning, genetic algorithm, evolutionary computation, source and target
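One simple way to realise this kind of transfer in a GA, sketched below, is to seed part of the initial population with solutions evolved on source problems and let the usual selection, crossover and mutation operators proceed. The toy bit-string target, the five partially related "source" solutions and all parameter values are illustrative assumptions, not the problems or scenarios studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n_bits, pop_size, generations, mut_rate = 40, 60, 60, 0.02
target = rng.integers(0, 2, n_bits)                   # hypothetical target problem
# Hypothetical "source problem" solutions: strings that partially overlap the target.
sources = [np.where(rng.random(n_bits) < 0.7, target, 1 - target) for _ in range(5)]

def fitness(pop):
    return (pop == target).sum(axis=1)                # bit matches with the target

def evolve(pop):
    for _ in range(generations):
        f = fitness(pop)
        a, b = rng.integers(0, pop_size, (2, pop_size))
        parents = pop[np.where(f[a] >= f[b], a, b)]   # binary tournament selection
        cut = rng.integers(1, n_bits, pop_size)
        mask = np.arange(n_bits) < cut[:, None]       # one-point crossover
        children = np.where(mask, parents, parents[rng.permutation(pop_size)])
        flip = rng.random(children.shape) < mut_rate  # bit-flip mutation
        pop = np.where(flip, 1 - children, children)
    return fitness(pop).max()

random_pop = rng.integers(0, 2, (pop_size, n_bits))
seeded_pop = random_pop.copy()
seeded_pop[:len(sources)] = sources                   # transfer: seed with source solutions
print("random init best:", evolve(random_pop), "| seeded init best:", evolve(seeded_pop))
```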
Procedia PDF Downloads 140
14871 Residual Life Estimation Based on Multi-Phase Nonlinear Wiener Process
Authors: Hao Chen, Bo Guo, Ping Jiang
Abstract:
Residual life (RL) estimation based on a multi-phase nonlinear Wiener process is studied in this paper, which is significant for complicated products with small samples. Firstly, a nonlinear Wiener model with a random parameter is introduced, and a multi-phase nonlinear Wiener model is proposed to model the degradation processes of products that are nonlinear and separated into different phases. Then the multi-phase RL probability density function based on the presented model is derived approximately in a closed form, and parameter estimation is achieved with the method of maximum likelihood estimation (MLE). Finally, the method is applied to estimate the RL of high-voltage pulse capacitors. Compared with three other models by means of the log-likelihood function (Log-LF) and the Akaike information criterion (AIC), the results show that the proposed degradation model can capture the degradation process of high-voltage pulse capacitors in a better way and provide a more reliable result.
Keywords: multi-phase nonlinear Wiener process, residual life estimation, maximum likelihood estimation, high-voltage pulse capacitor
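To show the MLE step in its simplest form, the sketch below estimates the drift and diffusion of a single-phase linear Wiener degradation model from one unit's inspection data and converts them into a mean residual life against a failure threshold. This is a deliberate simplification of the paper's multi-phase nonlinear model with a random parameter, and the readings, threshold and time unit are hypothetical.

```python
import numpy as np

def wiener_mle(times, values):
    """MLE for a single-phase linear Wiener degradation model X(t) = mu*t + sigma*B(t);
    `times`/`values` are one unit's inspection times and degradation readings."""
    dt = np.diff(times)
    dx = np.diff(values)
    mu = dx.sum() / dt.sum()                          # drift MLE
    sigma2 = np.mean((dx - mu * dt) ** 2 / dt)        # diffusion MLE
    return mu, np.sqrt(sigma2)

# Hypothetical degradation readings of one capacitor and a failure threshold D.
t = np.array([0.0, 50.0, 100.0, 150.0, 200.0, 250.0])
x = np.array([0.0, 0.9, 2.1, 2.8, 4.2, 5.1])
mu, sigma = wiener_mle(t, x)
D = 10.0
mean_rl = (D - x[-1]) / mu    # mean first-passage time for the linear-drift case
print(f"mu = {mu:.4f}, sigma = {sigma:.4f}, mean residual life ~ {mean_rl:.0f} h")
```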
Procedia PDF Downloads 453
14870 Techno-Economic Assessment of Aluminum Waste Management
Authors: Hamad Almohamadi, Abdulrahman AlKassem, Majed Alamoudi
Abstract:
Dumping aluminum (Al) waste into landfills causes several health and environmental problems. The pyrolysis process could treat Al waste to produce AlCl₃ and H₂. Using the Aspen Plus software, a techno-economic and feasibility assessment has been performed for Al waste pyrolysis. The Aspen Plus simulation was employed to estimate the mass and energy balance of the plant, which was assumed to process 100 dry metric tons of Al waste per day. This study looked at two cases of Al waste treatment. The first case produces 355 tons of AlCl₃ per day and 9 tons of H₂ per day without recycling. The conversion rate must be greater than 50% in case 1 to make a profit, and in this case, the MSP for AlCl₃ is $768/ton. The plant would generate $25 million annually if the AlCl₃ were sold at $1000 per ton. In case 2, with recycling, the conversion has less impact on the plant's profitability than in case 1. Moreover, compared to case 1, the MSP of AlCl₃ has no significant influence on process profitability. In this scenario, if AlCl₃ were sold at $1000/ton, the process profit would be $58 million annually. Case 2 is better than case 1 because recycling Al generates a higher yield than converting it to AlCl₃ and H₂.
Keywords: aluminum waste, Aspen Plus, process modelling, fast pyrolysis, techno-economic assessment
Procedia PDF Downloads 93
14869 The Effect of Tacit Knowledge for Intelligence Cycle
Authors: Bahadir Aydin
Abstract:
It is difficult to access accurate knowledge because of the mass of data. This huge volume of data makes the environment more and more chaotic. Data are the main pillar of intelligence. The affiliation between intelligence and knowledge is quite significant for understanding underlying truths. The data gathered from different sources can be modified, interpreted and classified by using the intelligence cycle process. This process is applied in order to progress to wisdom as well as intelligence. Within this process, the effect of tacit knowledge is crucial. Knowledge, which is classified as explicit and tacit knowledge, is the key element for any purpose. Tacit knowledge can be seen as 'the tip of the iceberg'. This tacit knowledge accounts for much more than we guess throughout the intelligence cycle. If the concept of the intelligence cycle is scrutinized, it can be seen that it contains risks and threats as well as successes. The main purpose of all organizations is to be successful by eliminating risks and threats. Therefore, there is a need to connect or fuse existing information and the processes which can be used to develop it. Thanks to this process, decision-makers can be presented with a clear, holistic understanding as early as possible in the decision-making process. Altering the current traditional reactive approach to a proactive intelligence cycle approach would reduce extensive duplication of work in the organization. By applying a new result-oriented cycle and tacit knowledge, intelligence can be procured and utilized more effectively and in a more timely manner.
Keywords: information, intelligence cycle, knowledge, tacit knowledge
Procedia PDF Downloads 514
14868 Taguchi-Based Surface Roughness Optimization for Slotted and Tapered Cylindrical Products in Milling and Turning Operations
Authors: Vineeth G. Kuriakose, Joseph C. Chen, Ye Li
Abstract:
The research follows a systematic approach to optimize the parameters for parts machined by turning and milling processes. The quality characteristic chosen is surface roughness, since the surface finish plays an important role for parts that require surface contact. A tapered cylindrical surface is designed as a test specimen for the research. The material chosen for machining is aluminum alloy 6061, due to its wide variety of industrial and engineering applications. A HAAS VF-2 TR computer numerical control (CNC) vertical machining center is used for milling and a HAAS ST-20 CNC machine is used for turning in this research. Taguchi analysis is used to optimize the surface roughness of the machined parts. The L9 orthogonal array is designed for four controllable factors with three different levels each, resulting in 18 experimental runs. The signal-to-noise (S/N) ratio is calculated for achieving the specific target value of 75 ± 15 µin. The controllable parameters chosen for the turning process are feed rate, depth of cut, coolant flow and finish cut, and those for the milling process are feed rate, spindle speed, step-over and coolant flow. The uncontrollable factors are tool geometry for the turning process and tool material for the milling process. Hypothesis testing is conducted to study the significance of the different uncontrollable factors on the surface roughness. The optimal parameter settings were identified from the Taguchi analysis, and the process capability Cp and the process capability index Cpk were improved from 1.76 and 0.02 to 3.70 and 2.10, respectively, for the turning process, and from 0.87 and 0.19 to 3.85 and 2.70, respectively, for the milling process. The surface roughness was improved from 60.17 µin to 68.50 µin, reducing the defect rate from 52.39% to 0% for the turning process, and from 93.18 µin to 79.49 µin, reducing the defect rate from 71.23% to 0% for the milling process. The purpose of this study is to efficiently utilize the Taguchi design analysis to improve the surface roughness.
Keywords: surface roughness, Taguchi parameter design, CNC turning, CNC milling
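The two quantities driving this analysis, the nominal-the-best S/N ratio and the Cp/Cpk capability indices against the 75 ± 15 µin specification, are straightforward to compute, as the sketch below shows for a set of hypothetical confirmation-run roughness readings (the readings themselves are not from the paper).

```python
import numpy as np

# Hypothetical surface roughness readings (micro-inch) from one confirmation run;
# the target band comes from the paper: 75 +/- 15 micro-inch.
readings = np.array([78.2, 73.5, 76.9, 71.8, 74.4, 77.1, 72.6, 75.8])
target, usl, lsl = 75.0, 90.0, 60.0

mean, std = readings.mean(), readings.std(ddof=1)

# Nominal-the-best signal-to-noise ratio used in Taguchi analysis.
sn_ratio = 10.0 * np.log10(mean ** 2 / std ** 2)

# Process capability against the two-sided specification.
cp = (usl - lsl) / (6.0 * std)
cpk = min(usl - mean, mean - lsl) / (3.0 * std)

print(f"S/N = {sn_ratio:.2f} dB, Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```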
Procedia PDF Downloads 155
14867 Motion-Based Detection and Tracking of Multiple Pedestrians
Authors: A. Harras, A. Tsuji, K. Terada
Abstract:
Tracking of moving people has become a matter of great importance due to rapid technological advancements in the field of computer vision. The objective of this study is to design a motion-based method for detecting and tracking multiple pedestrians walking randomly in different directions. In our proposed method, a Gaussian mixture model (GMM) is used to determine the moving persons in image sequences. It reacts to changes that take place in the scene, such as different illumination and moving objects that start and stop often. Background noise in the scene is eliminated by applying morphological operations, and the motion of the tracked people is determined using a Kalman filter. The Kalman filter is applied to predict the tracked location in each frame and to determine the likelihood of each detection. We used a benchmark data set, based on a stationary side-wall camera, for the evaluation. The actual scenes from the data set are taken on a street and include up to eight people in front of the camera in two different scenes, with durations of 53 and 35 seconds, respectively. In the case of pedestrians walking in close proximity, the proposed method achieved a detection ratio of 87% and a tracking ratio of 77%. When the pedestrians are separated from each other, the detection ratio increases to 90% and the tracking ratio increases to 79%.
Keywords: automatic detection, tracking, pedestrians, counting
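The prediction/update cycle that links each GMM detection to a track is the standard Kalman filter recursion. Below is a minimal constant-velocity Python sketch for a single pedestrian, where the measurement is the blob centroid from the foreground mask; the noise covariances, initial state and centroid sequence are illustrative assumptions rather than values from the benchmark data.

```python
import numpy as np

# Constant-velocity Kalman filter for one pedestrian track; state = [x, y, vx, vy],
# measurement = blob centroid [x, y] from the GMM foreground mask.
dt = 1.0
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], dtype=float)
Q = 0.01 * np.eye(4)        # process noise
R = 4.0 * np.eye(2)         # measurement noise (pixels^2)

x = np.array([100.0, 50.0, 0.0, 0.0])   # initial state
P = 100.0 * np.eye(4)

for z in [np.array([103.0, 51.0]), np.array([106.0, 52.5]), np.array([110.0, 53.8])]:
    # Predict where the pedestrian should be in this frame.
    x, P = F @ x, F @ P @ F.T + Q
    # Update with the detected centroid (the innovation also scores the detection).
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    print("estimated position:", np.round(x[:2], 1), "velocity:", np.round(x[2:], 2))
```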
Procedia PDF Downloads 257