Search results for: selenization process.
4307 Experimental Analysis of Diesel Hydrotreating Reactor to Develop a Simplified Tool for Process Real-Time Optimization
Authors: S. Shokri, S. Zahedi, M. Ahmadi Marvast, B. Baloochi, H. Ganji
Abstract:
In this research, a systematic investigation was carried out to determine the optimum operating conditions of an HDS (hydrodesulfurization) reactor. Moreover, a suitable model was developed for a rigorous RTO (real-time optimization) loop of the HDS process. A systematic series of experiments was designed based on CCD (Central Composite Design) and carried out in the related pilot plant to tune the developed model. The design variables in the experiments were temperature, LHSV and pressure, while the hydrogen to fresh feed ratio was kept constant. The ranges of these variables were 320-380 °C, 1-2 1/hr and 50-55 bar, respectively. A power-law kinetic model was also developed for our further research. The reaction order (power of reactant concentration), activation energy and frequency factor of this model were 1.4, 92.66 kJ/mol and k0 = 2.7×10^9, respectively.
Keywords: Statistical model, Multiphase Reactors, Gas oil, Hydrodesulfurization, Optimization, Kinetics
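As a quick worked illustration of the power-law rate expression reported above, the sketch below evaluates the Arrhenius rate constant and the reaction rate from the stated parameters (order 1.4, Ea = 92.66 kJ/mol, k0 = 2.7×10^9); the temperature and sulfur concentration used are illustrative assumptions, not values from the study.

```python
import numpy as np

# Reported kinetic parameters (from the abstract above)
n = 1.4            # reaction order with respect to sulfur concentration
Ea = 92.66e3       # activation energy, J/mol
k0 = 2.7e9         # frequency factor (units consistent with the rate expression)
R = 8.314          # universal gas constant, J/(mol*K)

def hds_rate(C_s, T_kelvin):
    """Power-law rate r = k0 * exp(-Ea/(R*T)) * C_s**n (illustrative units)."""
    k = k0 * np.exp(-Ea / (R * T_kelvin))
    return k * C_s**n

# Hypothetical operating point inside the reported experimental window (320-380 C)
T = 350.0 + 273.15     # K
C_s = 1.2e-2           # assumed sulfur concentration (arbitrary consistent units)
print(f"rate constant k = {k0 * np.exp(-Ea / (R * T)):.3e}")
print(f"reaction rate  r = {hds_rate(C_s, T):.3e}")
```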
4306 Statistically Significant Differences of Carbon Dioxide and Carbon Monoxide Emission in Photocopying Process
Authors: Kiurski S. Jelena, Kecić S. Vesna, Oros B. Ivana
Abstract:
Experimental results confirmed the temporal variation of carbon dioxide and carbon monoxide concentrations during the working shift of the photocopying process in a small photocopying shop in Novi Sad, Serbia. The statistical significance of the differences in the target gases was examined with two-way analysis of variance without replication, followed by Scheffe's post hoc test. Statistically significant differences were obtained for carbon monoxide emission, as indicated by F-values (12.37 and 31.88) greater than Fcrit (6.94), in contrast to carbon dioxide emission (F-values of 1.23 and 3.12 were less than Fcrit). Scheffe's post hoc test indicated that sampling point A (near the photocopier machine) and the second time interval contribute the most to carbon monoxide emission.
Keywords: Analysis of variance, carbon dioxide, carbon monoxide, photocopying indoor, Scheffe's test
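For readers unfamiliar with the F-test decision used above, the short sketch below reproduces the comparison of the reported F-values against Fcrit; the degrees of freedom (2 and 4) are an assumption chosen only because they reproduce the reported Fcrit of 6.94.

```python
from scipy import stats

# Two-way ANOVA without replication: compare computed F-values against F_crit.
# The degrees of freedom (2, 4) are an assumption that reproduces the reported
# F_crit of 6.94 (e.g., three sampling points x three time intervals).
alpha = 0.05
df_factor, df_error = 2, 4
f_crit = stats.f.ppf(1 - alpha, df_factor, df_error)   # ~6.94

for name, f_value in [("CO, first factor", 12.37),
                      ("CO, second factor", 31.88),
                      ("CO2, first factor", 1.23),
                      ("CO2, second factor", 3.12)]:
    significant = f_value > f_crit
    print(f"{name}: F = {f_value:.2f}, F_crit = {f_crit:.2f}, "
          f"significant = {significant}")
```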
4305 The Impact of 21st Century Technology in Higher Education: The Role of Artificial Intelligence
Authors: Josefina Bengoechea, Alex Bell
Abstract:
Higher education, with its brick-and-mortar facilities and credits based on hours of study, was developed to serve the needs of a national, industrial, analogue economy. However, the ongoing process of globalization on the one hand, and the emergence of ever-changing needs of employers on the other, make this type of process-based education obsolete, and exclusive to students who can afford to pay full-time tuition and dedicate four years of their lives exclusively to study. The creative destruction brought about by new technologies in the 21st century will reconfigure the labour market, as millions of jobs will be lost to Artificial Intelligence. The purpose of this paper is to consider whether the implementation of technology is the solution to the problems faced in higher education. The paper builds upon a constructivist approach, combining a literature review and research on key publications.
Keywords: Artificial intelligence, employability, labour market, new technology in higher education.
4304 Improvement of Ground Truth Data for Eye Location on Infrared Driver Recordings
Authors: Sorin Valcan, Mihail Găianu
Abstract:
Labeling is a very costly and time-consuming process which aims to generate datasets for training neural networks in several functionalities and projects. For driver monitoring system projects, the need for labeled images has a significant impact on the budget and distribution of effort. This paper presents the modifications made to a ground truth data generation algorithm for 2D eye location on infrared images of drivers in order to improve the quality of the data and the performance of the trained neural networks. The algorithm restrictions become stricter, which makes it more accurate but also less consistent. The resulting dataset becomes smaller and shall not be altered by any kind of manual label adjustment before being used in the neural network training process. These changes resulted in a much better performance of the trained neural networks.
Keywords: Labeling automation, infrared camera, driver monitoring, eye detection, Convolutional Neural Networks.
4303 Neighborhood Sustainability Assessment Tools: A Conceptual Framework for Their Use in Building Adaptive Capacity to Climate Change
Authors: Sally Naji, Julie Gwilliam
Abstract:
Climate change remains a challenging matter for humans and the built environment in the 21st century, where the need to consider adaptation to climate change in the development process is paramount. However, there remains a lack of information regarding how we should prepare responses to this issue, such as through developing organized and sophisticated tools enabling the adaptation process. This study aims to build a systematic framework approach to investigate the potential that Neighborhood Sustainability Assessment (NSA) tools might offer in enabling both the analysis and the promotion of adaptive capacity to climate change. The framework presented in this paper discusses this issue in three main phases. The first part attempts to link sustainability and climate change in the context of adaptive capacity. It is argued that in deciding to promote sustainability in the context of climate change, both the resilience and vulnerability processes become central. However, there is still a gap in the current literature regarding how the sustainable development process can respond to climate change, as well as how the resilience of practical strategies might be evaluated. It is suggested that integrating the sustainability assessment process with both resilience thinking and vulnerability might provide important components for addressing adaptive capacity to climate change. A critical review of the existing literature is presented, illustrating the current lack of work integrating these three concepts in the context of addressing adaptive capacity to climate change. The second part aims to identify the most appropriate scale at which to address the built environment for climate change adaptation. It is suggested that the neighborhood scale can be considered more suitable than either the building or urban scales. It then presents the example of NSAs and discusses the need to explore their potential role in promoting adaptive capacity to climate change. The third part of the framework presents a comparison among three example NSAs: BREEAM Communities, LEED-ND, and CASBEE-UD. These three tools have been selected as the most developed and comprehensive assessment tools currently available for the neighborhood scale. This study concludes that NSAs are likely to provide the basis for an organized framework to address the practical process of analyzing and promoting adaptive capacity to climate change. It is further argued that vulnerability (exposure and sensitivity) and resilience (interdependence and recovery) form essential aspects to be addressed in the future assessment of NSAs' capability to adapt to both short- and long-term climate change impacts. Finally, it is acknowledged that further work is now required to understand impact assessment in terms of the range of physical sectors (water, energy, transportation, building, land use and ecosystems), actor and stakeholder engagement, as well as a detailed evaluation of the NSA indicators, together with a barriers diagnosis process.
Keywords: Adaptive capacity, climate change, NSA tools, resilience, vulnerability.
4302 The Influence of Pad Thermal Diffusivity over Heat Transfer into the PCBs Structure
Authors: Mihai Brânzei, Ioan Plotog, Ion Pencea
Abstract:
Pads have unique values of thermophysical properties (THP) that make an important contribution to heat transfer into the PCB structure. Materials with high thermal diffusivity (TD) rapidly adjust their temperature to that of their surroundings, because heat transfer is quick in comparison to their volumetric heat capacity (VHC). The paper presents diffusivity tests (ASTM E1461 flash method) for PCBs with different core materials. In the experiments, the multilayer structure of the PCBA was taken into consideration, so that an equivalent property referring to each experimental structure was practically measured. Concerning the entire structure, the THP emphasize the major contribution of the substrate in establishing the heat transfer requirements of the reflow soldering process (RSP). This conclusion offers a practical solution for calculating the heat transfer time constant as a function of thickness and substrate material diffusivity with an acceptable error estimation.
Keywords: Heat transfer time constant, packaging, reflow soldering process, thermal diffusivity.
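A minimal sketch of the time-constant estimate mentioned in the conclusion, assuming the common diffusion scaling tau ≈ L²/α (thickness squared over thermal diffusivity); the thickness and diffusivity values below are illustrative assumptions, not the measured ones.

```python
# Characteristic heat-transfer time constant of a board layer, estimated as
# tau ~ L^2 / alpha (thickness squared over thermal diffusivity).
# The numerical values below are illustrative assumptions, not measured data.

def time_constant(thickness_m, diffusivity_m2_s):
    """Return the diffusion time constant tau = L^2 / alpha in seconds."""
    return thickness_m**2 / diffusivity_m2_s

substrates = {
    "FR-4 core (assumed alpha = 3.0e-7 m^2/s)": 3.0e-7,
    "Aluminium core (assumed alpha = 7.0e-5 m^2/s)": 7.0e-5,
}
thickness = 1.6e-3  # typical PCB thickness, m (assumption)

for name, alpha in substrates.items():
    print(f"{name}: tau = {time_constant(thickness, alpha):.2f} s")
```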
4301 Aircraft Selection Using Preference Optimization Programming (POP)
Authors: C. Ardil
Abstract:
A multiple-criteria decision support system is proposed for the best aircraft selection decision. Various strategic, economic, environmental, and risk-related factors can directly or indirectly influence this choice, and they should be taken into account in the decision-making process. The paper suggests a multiple-criteria analysis to aid in the airline management's decision-making process when choosing an appropriate aircraft. In terms of the suggested approach, an integrated entropic preference optimization programming (POP) for fleet modeling risk analysis is applied. The findings of the study of multiple criteria analysis indicate that the A321(neo) aircraft type is the best alternative in this particular optimization instance. The proposed methodology can be applied to other complex engineering problems involving multiple criteria analysis.
Keywords: Aircraft selection, decision making, multiple criteria decision making, preference optimization programming, POP, entropic weight method, TOPSIS, WSM, WPM
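Since the abstract mentions an entropic weighting step, the sketch below shows the standard entropy weight method on a hypothetical decision matrix; it is not the authors' full POP formulation.

```python
import numpy as np

# Entropy weight method: derive criteria weights from a decision matrix.
# The matrix below (alternatives x criteria) is a hypothetical example,
# not data from the study.
X = np.array([[0.82, 0.60, 0.75],
              [0.90, 0.55, 0.80],
              [0.70, 0.70, 0.65]], dtype=float)

P = X / X.sum(axis=0)                       # normalise each criterion column
m = X.shape[0]
k = 1.0 / np.log(m)
entropy = -k * (P * np.log(P)).sum(axis=0)  # entropy of each criterion
divergence = 1.0 - entropy                  # degree of diversification
weights = divergence / divergence.sum()     # entropy weights

print("criterion weights:", np.round(weights, 3))
```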
4300 Optimizing Approach for Sifting Process to Solve a Common Type of Empirical Mode Decomposition Mode Mixing
Authors: Saad Al-Baddai, Karema Al-Subari, Elmar Lang, Bernd Ludwig
Abstract:
Empirical mode decomposition (EMD), a new data-driven method of time-series decomposition, has the advantage of not assuming that a time series is linear or stationary, as is implicitly done in Fourier decomposition. However, EMD suffers from the mode mixing problem in some cases. The aim of this paper is to present a solution for a common type of signal causing the EMD mode mixing problem, namely a signal containing an intermittency. On an artificial example, the solution shows superior performance in coping with the EMD mode mixing problem compared with conventional EMD and Ensemble Empirical Mode Decomposition (EEMD). Furthermore, the over-sifting problem is also completely avoided, and the computational load is reduced roughly six times compared with EEMD with an ensemble number of 50.
Keywords: Empirical mode decomposition, mode mixing, sifting process, over-sifting.
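To make the sifting step concrete, the sketch below performs a single conventional sifting iteration (extrema detection, cubic-spline envelopes, mean-envelope subtraction) on a synthetic two-tone signal; it illustrates plain EMD sifting, not the authors' modified scheme.

```python
import numpy as np
from scipy.signal import argrelextrema
from scipy.interpolate import CubicSpline

def sift_once(x, t):
    """One sifting iteration: subtract the mean of the upper/lower envelopes."""
    max_idx = argrelextrema(x, np.greater)[0]
    min_idx = argrelextrema(x, np.less)[0]
    # Anchor the envelopes at the signal end points to limit edge effects.
    max_idx = np.concatenate(([0], max_idx, [len(x) - 1]))
    min_idx = np.concatenate(([0], min_idx, [len(x) - 1]))
    upper = CubicSpline(t[max_idx], x[max_idx])(t)
    lower = CubicSpline(t[min_idx], x[min_idx])(t)
    mean_env = 0.5 * (upper + lower)
    return x - mean_env, mean_env

# Synthetic two-component signal (illustrative, not the paper's test signal)
t = np.linspace(0.0, 1.0, 2000)
signal = np.sin(2 * np.pi * 5 * t) + 0.4 * np.sin(2 * np.pi * 40 * t)

candidate_imf, mean_env = sift_once(signal, t)
print("residual mean-envelope energy:", float(np.sum(mean_env**2)))
```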
4299 Faults Forecasting System
Authors: Hanaa E. Sayed, Hossam A. Gabbar, Shigeji Miyazaki
Abstract:
This paper presents a Faults Forecasting System (FFS) that utilizes statistical forecasting techniques in analyzing process variable data in order to forecast fault occurrences. FFS proposes a new idea in detecting faults. Current techniques used in fault detection are based on analyzing the current status of the system variables in order to check whether the current status is faulty or not. FFS uses forecasting techniques to predict the future timing of faults before they happen. The proposed model applies a subset modeling strategy and a Bayesian approach in order to decrease the dimensionality of the process variables and improve the fault forecasting accuracy. A practical experiment was designed and implemented at Okayama University, Japan, and the comparison shows that the proposed model achieves high forecasting accuracy ahead of time.
Keywords: Bayesian Techniques, Faults Detection, Forecasting Techniques, Multivariate Analysis.
4298 Teachers' Conceptions as a Basis for the Design of an Educational Application: Case Perioperative Nursing
Authors: Antti Pirhonen, Minna Silvennoinen
Abstract:
The only relevant basis for the design of an educational application is the set of learning objectives for the content area. This study analyses the process in which the real, not only the formal, objectives could work as the starting point for the construction of an educational game. The application context is the education of perioperative nursing. The process is based on panel discussions of nursing teachers. In the panels, the teachers elaborated the objectives. The transcribed discussions were analysed in terms of the conceptions of learning and teaching of perioperative nursing. The outcome of the study is, first, the elaborated objectives, which will be used in the implementation of an educational game for the needs of pre-, intra- and post-operative nursing skills learning. Second, the study shows that different views of learning need to be understood in order to design an appropriate educational application.
Keywords: Perioperative nursing, conceptions of learning, educational applications.
4297 Collaborative Research between Malaysian and Australian Universities on Learning Analytics: Challenges and Strategies
Authors: Z. Tasir, S. N. Kew, D. West, Z. Abdullah, D. Toohey
Abstract:
Research on Learning Analytics is progressively developing in the higher education field by concentrating on the process of students' learning. Therefore, a research project between Malaysian and Australian Universities was initiated in 2015 to look at the use of Learning Analytics to support the development of teaching practice. The focal point of this article is to discuss and share the experiences of Malaysian and Australian universities in the process of developing the collaborative research on Learning Analytics. Three aspects of this will be discussed: 1) Establishing an international research project and team members, 2) cross-cultural understandings, and 3) ways of working in relation to the practicalities of the project. This article is intended to benefit other researchers by highlighting the challenges as well as the strategies used in this project to ensure such collaborative research succeeds.
Keywords: Academic research project, collaborative research, cross-cultural understanding, international research project.
4296 Managing User Expectations in Information Systems Development
Authors: Linda Sau-ling Lai
Abstract:
This paper provides new ways to explore the old problem of failure of information systems development in an organisation. Based on the theory of cognitive dissonance, information systems (IS) failure is defined as a gap between what the users expect from an information system and how well these expectations are met by the perceived performance of the delivered system. Bridging the expectation-perception gap requires that IS professionals make a radical change from being the proprietors of information systems and products to being service providers. In order to deliver systems and services that IS users perceive as valuable, IS people must become expert in determining and assessing users' expectations and perceptions. It is also suggested that the IS community, in general, has given relatively little attention to the front-end process of requirements specification for IS development. There is a simplistic belief that requirements are obtainable from users and are then translatable into a formal specification. The process of information needs analysis is problematic and worthy of investigation.
Keywords: Information Systems Development, Cognitive Dissonance, Expectation-Perception Gap, Requirements Analysis.
4295 Construction Technology of Modified Vacuum Pre-Loading Method for Slurry Dredged Soil
Authors: Ali H. Mahfouz, Gao Ming-Jun, Mohamad Sharif
Abstract:
Slurry dredged soil in coastal areas has a high water content, poor permeability, and low surface strength. Hence, it is infeasible to use the conventional vacuum preloading method to treat this type of soil foundation. For the special case of super soft ground, a floating bridge is first constructed on the muddy soil and used as a service road and platform for implementing the modified vacuum preloading method. The modified technique of vacuum preloading and its construction process for the super soft soil foundation improvement are then studied. Application of the modified vacuum preloading method shows that the technology and its construction process are highly suitable for improving super soft soil foundations in coastal areas.
Keywords: Super soft foundation, dredger fill, vacuum preloading, foundation treatment, construction technology.
4294 Application of Recycled Tungsten Carbide Powder for Fabrication of Iron Based Powder Metallurgy Alloy
Authors: Yukinori Taniguchi, Kazuyoshi Kurita, Kohei Mizuta, Keigo Nishitani, Ryuichi Fukuda
Abstract:
Tungsten carbide is widely used as a tool material in metal manufacturing processes. Since tungsten is a typical rare metal, establishing a recycling process for tungsten carbide tools and restoring them into cemented carbide material would bring a great impact to the metal manufacturing industry. Recently, recycling processes for tungsten carbide have been gradually developed and established. However, the quality demands on cemented carbide tools are quite severe, because hardness, toughness, anti-wear ability, heat resistance, fatigue strength and so on should be guaranteed for precision machining and tool life. Currently, it is hard to restore the recycled tungsten carbide powder entirely as a raw material for newly processed cemented carbide tools. In this study, to suggest a positive use of recycled tungsten carbide powder, we have tried to fabricate a carbon-based sintered steel which shows reinforced mechanical properties with recycled tungsten carbide powder. We have made a set of newly designed sintered steels. A compression test of sintered specimens with a density ratio of 0.85 (which means 15% porosity inside) has been conducted. As a result, at least 1.7 times higher nominal strength was shown with a recycled WC powder content of 7.0 wt.%. The strength reached over 600 MPa for the Fe-WC-Co-Cu sintered alloy. A wear test has been conducted using a ball-on-disk type friction tester with a 5 mm diameter ball and a normal force of 2 N under dry conditions. The wear amount after a 1,000 m running distance shows that the designed sintered alloy has about 1.5 times longer life. Since the results of the tensile test showed the same tendency as previous testing, it is concluded that the designed sintered alloy can be used for several mechanical parts requiring special strength and anti-wear ability at relatively low cost, owing to the recycled tungsten carbide powder.
Keywords: Tungsten carbide, recycle process, compression test, powder metallurgy, anti-wear ability.
4293 Refining Waste Spent Hydroprocessing Catalyst and Their Metal Recovery
Authors: Meena Marafi, Mohan S. Rana
Abstract:
Catalysts play an important role in producing valuable fuel products in petroleum refining; however, due to feedstock impurities, the catalyst gets deactivated by carbon and metal deposition. The disposal of spent catalyst falls under the category of hazardous industrial waste that requires strict compliance with environmental regulations. The spent hydroprocessing catalyst contains Mo, V and Ni at high concentrations that have been found to be economically significant for recovery. The metal recovery process includes deoiling, decoking, grinding, dissolving and treatment with a complexing leaching agent such as ethylenediaminetetraacetic acid (EDTA). The process conditions have been optimized as a function of time, temperature and EDTA concentration in the presence of ultrasonic agitation. The results indicated that the optimum conditions established through this approach could recover 97%, 94% and 95% of the extracted Mo, V and Ni, respectively, while 95% of the EDTA was recovered after acid treatment.
Keywords: Spent catalyst, deactivation, hydrotreating.
4292 Multi-objective Optimisation of Composite Laminates under Heat and Moisture Effects using a Hybrid Neuro-GA Algorithm
Authors: M. R. Ghasemi, A. Ehsani
Abstract:
In this paper, the optimum weight and cost of a laminated composite plate are sought while it undergoes the heaviest load prior to complete failure. Various failure criteria are defined for such structures in the literature; in this work, the Tsai-Hill theory is used as the failure criterion. The analysis is based on the Classical Lamination Theory (CLT). A new type of Genetic Algorithm (GA), an optimization technique with direct use of real variables, was employed. Yet, since optimization via GAs is a long process and most of the time is consumed by the analysis, Radial Basis Function Neural Networks (RBFNN) were employed to predict the output of the analysis. Thus, the optimization is carried out in a hybrid neuro-GA environment, and the procedure continues until a predicted optimum solution is achieved.
Keywords: Composite laminates, GA, multi-objective optimisation, neural networks, RBFNN.
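A minimal sketch of the surrogate-assisted idea described above: an RBF interpolator (standing in for the RBFNN) is trained on a handful of analysis samples and then used as the fitness function inside a simple real-coded GA loop. The design variables, sample sizes, and placeholder analysis function are assumptions for illustration, not the laminate model of the paper.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)

def expensive_analysis(x):
    """Stand-in for the CLT/Tsai-Hill laminate analysis (illustrative only)."""
    return np.sum((x - 0.3)**2, axis=-1)

# 1) Sample the design space and train the RBF surrogate on the analysis output.
n_dim = 4                                   # e.g., four normalised design variables
X_train = rng.uniform(0.0, 1.0, size=(60, n_dim))
y_train = expensive_analysis(X_train)
surrogate = RBFInterpolator(X_train, y_train)

# 2) Minimal real-coded GA loop that evaluates fitness on the surrogate.
pop = rng.uniform(0.0, 1.0, size=(30, n_dim))
for generation in range(50):
    fitness = surrogate(pop)                        # cheap prediction, no analysis
    order = np.argsort(fitness)
    parents = pop[order[:10]]                       # truncation selection
    children = []
    for _ in range(len(pop)):
        a, b = parents[rng.integers(0, 10, size=2)]
        child = 0.5 * (a + b)                       # arithmetic crossover
        child += rng.normal(0.0, 0.05, size=n_dim)  # Gaussian mutation
        children.append(np.clip(child, 0.0, 1.0))
    pop = np.array(children)

best = pop[np.argmin(surrogate(pop))]
print("surrogate optimum:", best, "true analysis value:", expensive_analysis(best))
```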
4291 An Energy Integration Approach on UHDE Ammonia Process
Authors: Alnouss M. Ahmed, Al-Nuaimi A. Ibrahim
Abstract:
In this paper, the energy performance of a selected UHDE Ammonia plant is optimized by conducting heat integration through waste heat recovery and the synthesis of a heat exchange network (HEN). Minimum hot and cold utility requirements were estimated through the IChemE spreadsheet. Supporting simulation was carried out using HYSYS software. The results showed that there is no need for a heating utility, while the required cold utility was found to be around 268,714 kW. Hence, a threshold pinch case was faced. The hot and cold streams were then matched appropriately. Also, the waste heat recovered resulted in savings in HP and LP steam of approximately 51.0% and 99.6%, respectively. An economic analysis of the proposed HEN showed a very attractive overall payback period not exceeding 3 years. In general, a net saving approaching 35% was achieved by implementing heat optimization of the studied UHDE Ammonia process.
Keywords: Ammonia, energy optimization, heat exchange network, techno-economic analysis.
4290 An Automated Method to Segment and Classify Masses in Mammograms
Authors: Viet Dzung Nguyen, Duc Thuan Nguyen, Tien Dzung Nguyen, Van Thanh Pham
Abstract:
Mammography is the most effective procedure for an early diagnosis of breast cancer. Nowadays, efforts are made to find ways and methods to support radiologists as much as possible in the diagnosis process. The most popular approach now being developed is using a Computer-Aided Detection (CAD) system to process the digital mammograms and prompt the suspicious regions to the radiologist. In this paper, an automated CAD system for detection and classification of massive lesions in mammographic images is presented. The system consists of three processing steps: Regions-Of-Interest detection, feature extraction and classification. Our CAD system was evaluated on the Mini-MIAS database consisting of 322 digitized mammograms. The CAD system's performance is evaluated using Receiver Operating Characteristics (ROC) and Free-response ROC (FROC) curves. The achieved results are 3.47 false positives per image (FPpI) and a sensitivity of 85%.
Keywords: Classification, computer-aided detection, feature extraction, mass detection.
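The two reported figures of merit follow directly from the detection counts; the sketch below shows the arithmetic with hypothetical counts chosen only to reproduce the stated 85% sensitivity and 3.47 FPpI (they are not the actual Mini-MIAS evaluation data).

```python
# How the two reported CAD figures of merit are computed (illustrative counts).
true_positives = 51        # detected masses (assumption)
false_negatives = 9        # missed masses (assumption)
false_positives = 1117     # spurious detections over the whole set (assumption)
num_images = 322           # size of the Mini-MIAS database

sensitivity = true_positives / (true_positives + false_negatives)
fp_per_image = false_positives / num_images

print(f"sensitivity = {sensitivity:.2%}")                  # -> 85.00%
print(f"false positives per image = {fp_per_image:.2f}")   # -> ~3.47
```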
4289 A Goal-Driven Crime Scripting Framework
Authors: Hashem Dehghanniri
Abstract:
Crime scripting is a simple and effective crime modeling technique that aims to improve security analysts' understanding of security and crime incidents. Low-quality scripts provide a wrong, incomplete, or sophisticated understanding of the crime commission process, which opposes the purpose of their application, e.g., identifying effective and cost-efficient situational crime prevention (SCP) measures. One important and overlooked factor in generating quality scripts is the crime scripting method. This study investigates the problems within the existing crime scripting practices and proposes a crime scripting approach that contributes to generating quality crime scripts. It was validated by experienced crime scripters. This framework helps analysts develop better crime scripts and contributes to their effective application, e.g., SCP measure identification or policy-making.
Keywords: Attack modeling, crime commission process, crime script, situational crime prevention.
4288 Degradation of Amitriptyline Hydrochloride, Methyl Salicylate and 2-Phenoxyethanol in Water Systems by the Combination UV/Cl2
Authors: F. Javier Benitez, Francisco J. Real, Juan Luis Acero, Francisco Casas
Abstract:
Three emerging contaminants (amitriptyline hydrochloride, methyl salicylate and 2-phenoxyethanol) frequently found in wastewaters were selected to be individually degraded in ultra-pure water by the combined advanced oxidation process constituted by UV radiation and chlorine. The influence of pH, initial chlorine concentration and the nature of the contaminants was firstly explored. The trend for the reactivity of the selected compounds was deduced: amitriptyline hydrochloride > methyl salicylate > 2-phenoxyethanol. A later kinetic study was carried out, focused on the specific evaluation of the first-order rate constants and the determination of the partial contributions of the direct photochemical pathway and the radical pathway to the global reaction. A comparison of the rate constant values between photochemical experiments without and with the presence of Cl2 reveals a clear increase in the oxidation efficiency of the combined process with respect to the photochemical reaction alone. In a second stage, the simultaneous oxidation of mixtures of the selected contaminants in several types of water (ultrapure water, surface water from a reservoir, and two secondary effluents) was also performed by the same UV/Cl2 combination under more realistic operating conditions. The efficiency of this combined UV/Cl2 system was compared to other oxidants such as the UV/S2O82- and UV/H2O2 AOPs. Results confirmed that the UV/Cl2 system provides higher elimination efficiencies among the AOPs tested.
Keywords: Emerging contaminants, amitriptyline, methyl salicylate, 2-phenoxyethanol, chlorination, photolysis, rate constants, UV/chlorine advanced oxidation process.
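A common way to obtain the first-order rate constants mentioned above is a linear fit of ln(C0/C) versus time; the sketch below shows this on a synthetic degradation run (not the paper's data), with an assumed true rate constant of 0.12 min^-1.

```python
import numpy as np

# Estimating an apparent pseudo-first-order rate constant from a degradation
# run: ln(C0/C) = k_app * t. The time/concentration series below is synthetic,
# generated for illustration only.
rng = np.random.default_rng(1)
t = np.linspace(0, 30, 10)                        # irradiation time, min
k_true = 0.12                                     # assumed rate constant, min^-1
C = np.exp(-k_true * t) * (1 + rng.normal(0, 0.01, t.size))

k_app, _ = np.polyfit(t, np.log(C[0] / C), 1)     # slope of ln(C0/C) vs t
print(f"fitted apparent rate constant: {k_app:.3f} min^-1")
```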
4287 Land Surface Temperature and Biophysical Factors in Urban Planning
Authors: Illyani Ibrahim, Azizan Abu Samah, Rosmadi Fauzi
Abstract:
Land surface temperature (LST) is an important parameter to study in urban climate. Understanding the influence of biophysical factors could improve the modeling of the urban thermal landscape. It is well established that climate holds a great influence on the urban landscape. However, it has been recognized that climate has a low priority in the urban planning process, due to the complex nature of its influence. This study focuses on a relatively cloud-free Landsat Thematic Mapper image of the study area, acquired on 2 March 2006. Correlation analyses were conducted to identify the relationship of LST to the biophysical factors (vegetation indices, impervious surface, and albedo) in order to investigate the variation of LST. We suggest that the results can be considered by the stakeholders during the decision-making process to create a cooler and more comfortable environment in the urban landscape for city dwellers.
Keywords: Biophysical factors, land surface temperature, urban planning.
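A minimal sketch of the correlation analysis described above, using a synthetic sample of NDVI and LST pixel values; the negative NDVI-LST relationship in the synthetic data is an assumption made for illustration.

```python
import numpy as np
from scipy import stats

# Correlation of LST against one biophysical factor (e.g., NDVI).
# The arrays below are synthetic pixel samples for illustration only.
rng = np.random.default_rng(2)
ndvi = rng.uniform(0.1, 0.8, 500)
lst = 38.0 - 12.0 * ndvi + rng.normal(0.0, 1.0, 500)   # cooler where greener

r, p_value = stats.pearsonr(ndvi, lst)
print(f"Pearson r = {r:.2f}, p = {p_value:.1e}")
```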
4286 Statistical Process Optimization Through Multi-Response Surface Methodology
Authors: S. Raissi, R. Eslami Farsani
Abstract:
In recent years, response surface methodology (RSM) has attracted the attention of many quality engineers in different industries. Most of the published literature on robust design methodology is basically concerned with the optimization of a single response or quality characteristic which is often most critical to consumers. For most products, however, quality is multidimensional, so it is common to observe multiple responses in an experimental situation. Through this paper, interested readers will become familiar with this methodology via a survey of the most cited technical papers. It is believed that the procedure proposed in this study can resolve a complex parameter design problem with more than two responses. It can be applied to those areas where there are large data sets and a number of responses are to be optimized simultaneously. In addition, the proposed procedure is relatively simple and can be implemented easily by using ready-made standard statistical packages.
Keywords: Multi-Response Surface Methodology (MRSM), Design of Experiments (DOE), process modeling, quality improvement, robust design.
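As a single-response building block of the methodology surveyed above, the sketch below fits a second-order response surface by ordinary least squares to synthetic design points; a multi-response study would fit one such model per response and then optimize them jointly, for example through desirability functions. The design points and coefficients are illustrative assumptions.

```python
import numpy as np

# Fitting a second-order (quadratic) response surface
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
# by ordinary least squares. The data are synthetic, for illustration only.
rng = np.random.default_rng(3)
x1 = rng.uniform(-1, 1, 20)
x2 = rng.uniform(-1, 1, 20)
y = 5.0 + 2.0 * x1 - 1.5 * x2 - 3.0 * x1**2 - 2.0 * x2**2 + rng.normal(0, 0.1, 20)

# Design matrix for the full quadratic model
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients:", np.round(coef, 2))
```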
4285 The Fake News Impact on the Public Policy Cycle: A Systemic Analysis through Documentary Survey
Authors: Aron Miranda Burgos, Ergon Cugler de Moraes Silva
Abstract:
In the present article, it is observed that the constant advance of misinformation-related issues impacts the guarantees of the public policy cycle. It is found that the dissemination of false information has a direct influence on each of the component stages of this cycle. Therefore, in order to maintain scientific and theoretical credibility in the qualitative analysis process, it was necessary to logically interpose the concepts of firehosing of falsehood, fake news, and the public policy cycle, as well as to use an epistemological and pragmatic mechanism at the intersection of such academic concepts, namely the scientific method. Through the analysis of official documents and public notes, it was found how the multiple theoretical perspectives show that the provision and elaboration of public policies are compromised, verifying the way in which fake news impacts each part of the process in this atmosphere.
Keywords: Firehosing of falsehood, governance, misinformation, post-truth.
4284 Treatment of Tannery Effluents by the Process of Coagulation
Authors: G. Shegani
Abstract:
Coagulation is a process that sanitizes leather effluents. It aims to reduce pollutants such as Chemical Oxygen Demand (COD), chloride, sulfate, chromium, suspended solids, and other dissolved solids. The current study aimed to evaluate the coagulation efficiency of tannery wastewater by analyzing the change in organic matter, odor, color, ammonium ions, nutrients, chloride, H2S, sulfate, suspended solids, total dissolved solids, fecal pollution, and hexavalent chromium before and after treatment. Effluent samples were treated with the coagulants Ca(OH)2 and FeSO4·7H2O. The main advantages of this treatment included the removal of COD (81.60%), ammonium ions (98.34%), nitrate ions (92%), hexavalent chromium (75.00%), phosphate (70.00%), chloride (69.20%), and H2S (50%). Results also indicated a high level of efficiency in the reduction of fecal pollution indicators. Unfortunately, only a modest reduction of sulfate (19.00%) and TSS (13.00%) and an increase in TDS (15.60%) were observed.
Keywords: Coagulation, Effluent, Tannery, Treatment.
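The removal percentages quoted above are computed as (C_in - C_out)/C_in × 100; the sketch below shows the arithmetic with assumed influent and effluent COD values chosen only to reproduce the reported 81.6% figure.

```python
# Removal efficiency of a pollutant after coagulation:
# efficiency (%) = (C_in - C_out) / C_in * 100.
# The influent/effluent concentrations below are illustrative assumptions.
def removal_efficiency(c_in, c_out):
    return (c_in - c_out) / c_in * 100.0

cod_in, cod_out = 5000.0, 920.0    # mg/L (assumed values)
print(f"COD removal = {removal_efficiency(cod_in, cod_out):.1f}%")   # -> 81.6%
```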
4283 Level Set and Morphological Operation Techniques in Application of Dental Image Segmentation
Authors: Abdolvahab Ehsani Rad, Mohd Shafry Mohd Rahim, Alireza Norouzi
Abstract:
Medical image analysis is one of the great applications of computer image processing. There are several processes for analyzing medical images, among which segmentation is one of the most challenging and important steps. In this paper, a segmentation method is proposed in order to segment dental radiograph images. A thresholding method has been applied to simplify the images, and the morphological opening of the binary image has been performed to eliminate unnecessary regions in the images. Furthermore, horizontal and vertical integral projection techniques are used to extract each individual tooth from the radiograph images. The segmentation process has been completed by applying the level set method to each extracted image. The experimental results, with 90% accuracy, demonstrate that the proposed method achieves high accuracy and promising results.
Keywords: Integral projection, level set method, morphological operation, segmentation.
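A minimal sketch of the integral projection step described in the abstract: summing a thresholded image along rows and columns yields profiles whose empty (or low) columns mark the gaps between neighbouring teeth. The tiny binary array is synthetic and only illustrative.

```python
import numpy as np

# Horizontal and vertical integral projections of a binary (thresholded) image:
# row/column sums give profiles whose minima indicate candidate separations.
image = np.zeros((8, 12), dtype=int)
image[1:4, 1:5] = 1      # "tooth" 1
image[1:4, 7:11] = 1     # "tooth" 2

horizontal_projection = image.sum(axis=1)   # one value per row
vertical_projection = image.sum(axis=0)     # one value per column

gap_columns = np.where(vertical_projection == 0)[0]
print("horizontal projection:", horizontal_projection)
print("vertical projection:  ", vertical_projection)
print("candidate separation columns:", gap_columns)
```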
4282 PetriNets Manipulation to Reduce Roaming Duration: Criterion to Improve Handoff Management
Authors: Hossam el-ddin Mostafa, Pavel Čičak
Abstract:
IETF RFC 2002 originally introduced the wireless Mobile-IP protocol to support portable IP addresses for mobile devices that often change their network access points to the Internet. The inefficiency of this protocol, mainly within handoff management, produces large end-to-end packet delays during the registration process and further degrades system efficiency due to packet losses between subnets. The criterion for initiating a simple and fast full-duplex connection between the home agent and the foreign agent, in order to reduce the roaming duration, is a very important issue considered by the work in this paper. State-transition Petri-Nets modeling the scenario-based CIA (communication inter-agents) procedure, as an extension to the basic Mobile-IP registration process, were designed and manipulated. A heuristic configuration file for the registration parameters was created during a practical setup session on a Cisco Router-1760 platform running IOS 12.3(15)T. Finally, stand-alone performance simulation results from Simulink/Matlab, within each subnet and also between subnets, are illustrated, reporting better end-to-end packet delays. The results verified the effectiveness of our Mathcad analytical manipulation and experimental implementation. They showed lower values of end-to-end packet delay for Mobile-IP using the CIA procedure. Furthermore, they showed improved packet flow between subnets, reducing packet losses between subnets.
Keywords: Cisco configuration, handoff, packet delay, Petri-Nets, registration process, Simulink.
4281 A New Automatic System of Cell Colony Counting
Authors: U. Bottigli, M.Carpinelli, P.L. Fiori, B. Golosio, A. Marras, G. L. Masala, P. Oliva
Abstract:
The counting of cell colonies is always a long and laborious process that depends on the judgment and ability of the operator. The judgment of the operator in counting can vary in relation to fatigue. Moreover, since this activity is time-consuming, it can limit the usable number of dishes for each experiment. For these reasons, it is necessary that an automatic system of cell colony counting be used. This article introduces a new automatic counting system based on the processing of digital images of cellular colonies grown on petri dishes. The system is mainly based on region-growing algorithms for the recognition of the regions of interest (ROI) in the image and a Sanger neural net for the characterization of such regions. The best final classification is supplied by a Feed-Forward Neural Net (FF-NN) and compared with the K-Nearest Neighbour (K-NN) classifier and a Linear Discriminative Function (LDF). Preliminary results are shown.
Keywords: Automatic cell counting, neural network, region growing, Sanger net.
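A minimal sketch of the region-growing idea used for ROI recognition, assuming a simple intensity-tolerance criterion and 4-connectivity on a tiny synthetic image; it is not the authors' full algorithm, nor the Sanger-net characterization step.

```python
import numpy as np
from collections import deque

def region_growing(image, seed, tol=10):
    """Grow a 4-connected region from `seed`, accepting pixels whose intensity
    differs from the seed value by at most `tol` (illustrative version)."""
    h, w = image.shape
    seed_value = float(image[seed])
    mask = np.zeros((h, w), dtype=bool)
    queue = deque([seed])
    mask[seed] = True
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < h and 0 <= cc < w and not mask[rr, cc]:
                if abs(float(image[rr, cc]) - seed_value) <= tol:
                    mask[rr, cc] = True
                    queue.append((rr, cc))
    return mask

# Tiny synthetic image: a bright "colony" on a dark background (illustration only).
img = np.full((10, 10), 20, dtype=np.uint8)
img[3:7, 3:7] = 200
colony_mask = region_growing(img, seed=(4, 4), tol=30)
print("colony area (pixels):", int(colony_mask.sum()))
```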
4280 Auditory Brainstem Response in Wave VI for the Detection of Learning Disabilities
Authors: M. Victoria Garcia-Camba, M. Isabel Garcia-Planas
Abstract:
The use of the brainstem auditory evoked potential (BAEP) is a common way to study people's hearing function; by studying the behaviour of wave VI, it is also a way to learn about the functionality of the brain neuronal groups that intervene in the learning process. The latest advances in neuroscience have revealed the existence of different brain activity in the learning process that can be highlighted through the use of innocuous, low-cost and easily accessible techniques such as, among others, the BAEP, which can help us to detect early possible neurodevelopmental difficulties for their subsequent assessment and cure. To date, and to the authors' best knowledge, only the latency data obtained by observing waves I to V, mainly in the left ear, were taken into account. This work shows that it is essential to consider both ears; with these latest data, it has been possible to diagnose more precisely some cases that had previously been diagnosed as “normal” despite showing signs of some alteration that motivated the new consultation with the specialist.
Keywords: Ear, neurodevelopment, auditory evoked potentials, intervals of normality, learning disabilities.
4279 Augmented Reality Sandbox and Constructivist Approach for Geoscience Teaching and Learning
Authors: Muhammad Nawaz, Sandeep N. Kundu, Farha Sattar
Abstract:
The augmented reality sandbox adds new dimensions to the education and learning process. It can be a core component of geoscience teaching and learning for understanding geographic contexts and landform processes. The augmented reality sandbox is a useful tool not only for creating an interactive learning environment through spatial visualization but also for providing an active learning experience to students and enhancing the cognitive process of learning. The augmented reality sandbox can be used as an interactive learning tool to teach geomorphic and landform processes. This article explains the augmented reality sandbox and the constructivist approach for geoscience teaching and learning, and endeavours to explore ways to teach geographic processes using the three-dimensional digital environment for the deep, interactive learning of geoscience concepts.
Keywords: Augmented Reality Sandbox, constructivism, deep learning, geoscience.
4278 Scheduling for a Reconfigurable Manufacturing System with Multiple Process Plans and Limited Pallets/Fixtures
Authors: Jae-Min Yu, Hyoung-Ho Doh, Ji-Su Kim, Dong-Ho Lee, Sung-Ho Nam
Abstract:
A reconfigurable manufacturing system (RMS) is an advanced system designed at the outset for rapid changes in its hardware and software components in order to quickly adjust its production capacity and functionality. Among various operational decisions, this study considers the scheduling problem that determines the input sequence and the schedule at the same time for a given set of parts. In particular, we consider the practical constraint that the numbers of pallets/fixtures are limited and hence a part can be released into the system only when the fixture required for the part is available. To solve the integrated input sequencing and scheduling problem, we suggest a priority rule based approach in which the two sub-problems are solved using a combination of priority rules. To show the effectiveness of various rule combinations, a simulation experiment was done on data for a real RMS, and the test results are reported.
Keywords: Reconfigurable manufacturing system, scheduling, priority rules, multiple process plans, pallets/fixtures.
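A minimal sketch of the priority-rule idea with limited fixtures: parts are released only when a fixture of the required type is free, the Shortest Processing Time (SPT) rule picks among releasable parts, and fixtures return at completion. The part data, fixture counts, and the assumption of ample processing stations are hypothetical simplifications, not the paper's RMS model or rule set.

```python
import heapq

# Priority-rule dispatching with limited fixtures (illustrative data).
parts = [  # (part id, processing time, fixture type)
    ("P1", 5, "A"), ("P2", 3, "A"), ("P3", 8, "B"),
    ("P4", 2, "B"), ("P5", 6, "A"), ("P6", 4, "B"),
]
free_fixtures = {"A": 1, "B": 1}      # limited pallets/fixtures per type

clock = 0
waiting = list(parts)
running = []                          # heap of (finish time, part id, fixture type)
schedule = []                         # (part id, start, end)

while waiting or running:
    releasable = [p for p in waiting if free_fixtures[p[2]] > 0]
    if releasable:
        pid, ptime, ftype = min(releasable, key=lambda p: p[1])   # SPT rule
        waiting.remove((pid, ptime, ftype))
        free_fixtures[ftype] -= 1                  # fixture occupied at release
        heapq.heappush(running, (clock + ptime, pid, ftype))
        schedule.append((pid, clock, clock + ptime))
    else:
        finish, pid, ftype = heapq.heappop(running)  # wait for a fixture to return
        clock = finish
        free_fixtures[ftype] += 1

print("sequence:", schedule)
print("makespan:", max(end for _, _, end in schedule))
```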